Apr 21 17:33:32.733556 ip-10-0-129-92 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 17:33:33.214396 ip-10-0-129-92 kubenswrapper[2583]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 17:33:33.214396 ip-10-0-129-92 kubenswrapper[2583]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 17:33:33.214396 ip-10-0-129-92 kubenswrapper[2583]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 17:33:33.215049 ip-10-0-129-92 kubenswrapper[2583]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 17:33:33.215049 ip-10-0-129-92 kubenswrapper[2583]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 17:33:33.216199 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.216093 2583 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 17:33:33.219421 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219403 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219432 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219436 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219439 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219442 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219447 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219451 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219454 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219457 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219460 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219463 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219466 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219468 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219471 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219474 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219476 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219479 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219481 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219483 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219486 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 17:33:33.219482 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219489 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219492 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219495 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219503 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219506 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219509 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219511 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219513 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219516 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219518 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219520 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219524 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219526 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219529 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219537 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219539 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219541 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219544 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219546 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 17:33:33.219959 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219549 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219551 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219554 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219557 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219560 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219562 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219564 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219567 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219569 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219572 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219574 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219577 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219579 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219582 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219584 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219588 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219591 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219594 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219597 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219599 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 17:33:33.220434 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219602 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219604 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219607 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219609 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219612 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219614 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219617 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219620 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219629 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219632 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219634 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219636 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219639 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219641 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219644 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219646 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219652 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219655 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219657 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 17:33:33.220958 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219660 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219662 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219664 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219667 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219669 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219672 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219674 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.219676 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220186 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220193 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220195 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220198 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220200 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220203 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220206 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220208 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220211 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220213 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220215 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220218 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 17:33:33.221449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220220 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220228 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220231 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220234 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220236 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220238 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220241 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220243 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220246 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220248 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220251 2583 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220253 2583 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220255 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220258 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220260 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220262 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220265 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220267 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220269 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220272 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 17:33:33.221934 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220275 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220278 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220280 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220283 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220286 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220288 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220290 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220293 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220295 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220297 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220300 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220302 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220305 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220307 2583 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220314 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220317 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220320 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220322 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220326 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 17:33:33.222448 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220330 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220334 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220337 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220339 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220342 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220345 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220348 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220350 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220353 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220356 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220358 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220360 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220363 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220366 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220369 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220371 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220373 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220377 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220381 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220384 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 17:33:33.222912 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220386 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220389 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220391 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220393 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220396 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220398 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220400 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220403 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220411 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220414 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220416 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220419 2583 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220421 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220424 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.220426 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221520 2583 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221530 2583 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221546 2583 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221551 2583 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221555 2583 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221558 2583 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 17:33:33.223430 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221563 2583 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221567 2583 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221570 2583 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221573 2583 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221579 2583 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221582 2583 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221585 2583 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221589 2583 flags.go:64] FLAG: --cgroup-root=""
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221591 2583 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221595 2583 flags.go:64] FLAG: --client-ca-file=""
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221597 2583 flags.go:64] FLAG: --cloud-config=""
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221600 2583 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221603 2583 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221610 2583 flags.go:64] FLAG: --cluster-domain=""
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221612 2583 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221616 2583 flags.go:64] FLAG: --config-dir=""
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221618 2583 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221622 2583 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221626 2583 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221629 2583 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221634 2583 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221638 2583 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221641 2583 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221643 2583 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 17:33:33.223940 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221646 2583 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221650 2583 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221653 2583 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221658 2583 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221661 2583 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221664 2583 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221667 2583 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221670 2583 flags.go:64] FLAG: --enable-server="true"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.221673 2583 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222436 2583 flags.go:64] FLAG: --event-burst="100"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222440 2583 flags.go:64] FLAG: --event-qps="50"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222443 2583 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222447 2583 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222450 2583 flags.go:64] FLAG: --eviction-hard=""
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222455 2583 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222458 2583 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222461 2583 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222464 2583 flags.go:64] FLAG: --eviction-soft=""
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222467 2583 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222470 2583 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222473 2583 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222476 2583 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222479 2583 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222481 2583 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222484 2583 flags.go:64] FLAG: --feature-gates=""
Apr 21 17:33:33.224541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222488 2583 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222491 2583 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222494 2583 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222497 2583 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222502 2583 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222504 2583 flags.go:64] FLAG: --help="false"
Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222507 2583 flags.go:64] FLAG: --hostname-override="ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421
17:33:33.222510 2583 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222513 2583 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222516 2583 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222520 2583 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222523 2583 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222526 2583 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222529 2583 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222531 2583 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222534 2583 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222537 2583 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222540 2583 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222542 2583 flags.go:64] FLAG: --kube-reserved="" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222546 2583 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222549 2583 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 
17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222552 2583 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222555 2583 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222557 2583 flags.go:64] FLAG: --lock-file="" Apr 21 17:33:33.225196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222560 2583 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222563 2583 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222566 2583 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222572 2583 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222574 2583 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222577 2583 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222580 2583 flags.go:64] FLAG: --logging-format="text" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222582 2583 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222586 2583 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222588 2583 flags.go:64] FLAG: --manifest-url="" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222591 2583 flags.go:64] FLAG: --manifest-url-header="" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222596 2583 flags.go:64] FLAG: 
--max-housekeeping-interval="15s" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222598 2583 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222603 2583 flags.go:64] FLAG: --max-pods="110" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222607 2583 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222610 2583 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222612 2583 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222615 2583 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222619 2583 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222622 2583 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222625 2583 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222634 2583 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222637 2583 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222640 2583 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 17:33:33.225829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222643 2583 flags.go:64] FLAG: --pod-cidr="" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222646 2583 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222652 2583 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222655 2583 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222658 2583 flags.go:64] FLAG: --pods-per-core="0" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222660 2583 flags.go:64] FLAG: --port="10250" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222663 2583 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222666 2583 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f445311f94f02db0" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222670 2583 flags.go:64] FLAG: --qos-reserved="" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222672 2583 flags.go:64] FLAG: --read-only-port="10255" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222675 2583 flags.go:64] FLAG: --register-node="true" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222679 2583 flags.go:64] FLAG: --register-schedulable="true" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222682 2583 flags.go:64] FLAG: --register-with-taints="" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222685 2583 flags.go:64] FLAG: --registry-burst="10" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222688 2583 flags.go:64] FLAG: --registry-qps="5" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222690 2583 flags.go:64] FLAG: --reserved-cpus="" Apr 21 17:33:33.226422 
ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222693 2583 flags.go:64] FLAG: --reserved-memory="" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222697 2583 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222700 2583 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222703 2583 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222706 2583 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222708 2583 flags.go:64] FLAG: --runonce="false" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222712 2583 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222715 2583 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222717 2583 flags.go:64] FLAG: --seccomp-default="false" Apr 21 17:33:33.226422 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222720 2583 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222723 2583 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222727 2583 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222730 2583 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222733 2583 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222736 2583 flags.go:64] 
FLAG: --storage-driver-secure="false" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222739 2583 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222742 2583 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222745 2583 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222748 2583 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222750 2583 flags.go:64] FLAG: --system-cgroups="" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222753 2583 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222759 2583 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222761 2583 flags.go:64] FLAG: --tls-cert-file="" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222764 2583 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222768 2583 flags.go:64] FLAG: --tls-min-version="" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222771 2583 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222774 2583 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222777 2583 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222779 2583 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 17:33:33.227018 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:33:33.222782 2583 flags.go:64] FLAG: --v="2" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222787 2583 flags.go:64] FLAG: --version="false" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222791 2583 flags.go:64] FLAG: --vmodule="" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222796 2583 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.222799 2583 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 17:33:33.227018 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222912 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222917 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222921 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222924 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222928 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222931 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222934 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222937 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222942 2583 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222945 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222947 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222950 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222952 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222955 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222957 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222960 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222962 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222965 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222967 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222970 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 17:33:33.227648 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222972 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222975 
2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222977 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222980 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222982 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222985 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222987 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222990 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222993 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222995 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.222998 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223000 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223003 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223006 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 17:33:33.228139 ip-10-0-129-92 
kubenswrapper[2583]: W0421 17:33:33.223008 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223011 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223013 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223015 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223018 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223021 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 17:33:33.228139 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223025 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223027 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223029 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223032 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223035 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223037 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223040 2583 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNSInstall Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223042 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223045 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223047 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223049 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223052 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223054 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223057 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223060 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223062 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223065 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223067 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223070 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 
17:33:33.223072 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 17:33:33.228680 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223074 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223077 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223079 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223082 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223084 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223087 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223089 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223092 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223094 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223096 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223099 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223101 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 17:33:33.229219 
ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223105 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223108 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223110 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223113 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223117 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223120 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223124 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 17:33:33.229219 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223126 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 17:33:33.229694 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223129 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 17:33:33.229694 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223133 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 17:33:33.229694 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223137 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 17:33:33.229694 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223140 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 17:33:33.229694 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223143 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 17:33:33.229694 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.223146 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 17:33:33.229694 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.223154 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 17:33:33.230802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.230773 2583 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 17:33:33.230802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.230797 2583 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230887 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230896 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230900 2583 feature_gate.go:328] unrecognized feature gate: 
Example2 Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230904 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230907 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230910 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230913 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230916 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230920 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230923 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230926 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230928 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230931 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230934 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230940 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230943 2583 
feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230946 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230949 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230952 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 17:33:33.230957 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230955 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230958 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230961 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230963 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230966 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230970 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230973 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230982 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230986 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 17:33:33.231654 ip-10-0-129-92 
kubenswrapper[2583]: W0421 17:33:33.230991 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230995 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.230999 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231003 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231008 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231012 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231016 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231022 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231026 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231030 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231034 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 17:33:33.231654 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231043 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231047 2583 feature_gate.go:328] unrecognized feature gate: NewOLM 
Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231051 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231056 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231060 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231064 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231068 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231073 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231079 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
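The long runs of `unrecognized feature gate` warnings above come from OpenShift-specific gate names being handed to the upstream kubelet's feature-gate parser, which only recognizes the Kubernetes gates; the warnings are noisy but harmless. A minimal sketch for summarizing which gates trigger the warning and how often (the sample lines below are illustrative, adapted from the format in this log):

```python
import re
from collections import Counter

# Illustrative journald-style lines in the format shown above (not the full log).
log = """\
Apr 21 17:33:33 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231051 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 17:33:33 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231056 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 17:33:33 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231929 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
"""

# Each warning ends with the gate name; count repeats per gate.
pattern = re.compile(r"unrecognized feature gate: (\S+)")
counts = Counter(pattern.findall(log))
for gate, n in sorted(counts.items()):
    print(f"{gate}: {n}")
```

Feeding `journalctl -u kubelet` output through the same pattern gives a quick inventory of the non-Kubernetes gates a node was configured with.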
Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231087 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231092 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231096 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231106 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231112 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231117 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231123 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231127 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231132 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231137 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 17:33:33.232150 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231141 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231146 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231152 2583 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231157 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231162 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231188 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231193 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231198 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231202 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231206 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231210 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231214 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231218 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231222 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231226 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 17:33:33.232642 
ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231230 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231234 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231238 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231247 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231250 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 17:33:33.232642 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231253 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231255 2583 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231258 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231261 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231264 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231267 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231270 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231274 2583 feature_gate.go:328] unrecognized feature 
gate: DNSNameResolver Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.231281 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231733 2583 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231750 2583 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231757 2583 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231762 2583 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231767 2583 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231772 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231777 2583 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 17:33:33.233132 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231781 2583 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231785 2583 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231790 2583 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231794 2583 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231798 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231801 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231805 2583 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231811 2583 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
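Interleaved with the warnings, the kubelet logs its effective gate set as a Go map literal (`feature gates: {map[Name:bool ...]}`), as seen in the `feature_gate.go:384` lines. A rough sketch of turning one of those dumps into a Python dict (the shortened sample line is mine; its shape matches the dumps in this log):

```python
import re

# Shortened illustrative sample of a kubelet "feature gates" dump line.
line = "feature gates: {map[ImageVolume:true KMSv1:true NodeSwap:false]}"

# Pull out the space-separated Name:bool pairs inside {map[...]}.
inner = re.search(r"\{map\[(.*)\]\}", line).group(1)
gates = {name: value == "true"
         for name, value in (pair.split(":") for pair in inner.split())}
print(gates)
```

This makes it easy to diff the effective gate set between nodes or between the repeated dumps emitted during a single startup.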
Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231818 2583 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231823 2583 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231826 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231831 2583 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231835 2583 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231839 2583 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231843 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231847 2583 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231851 2583 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231855 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231860 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 17:33:33.233702 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231864 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231868 2583 feature_gate.go:328] 
unrecognized feature gate: HighlyAvailableArbiter Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231872 2583 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231876 2583 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231881 2583 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231885 2583 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231889 2583 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231895 2583 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231899 2583 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231903 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231908 2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231911 2583 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231913 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231916 2583 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231919 
2583 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231921 2583 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231924 2583 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231926 2583 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231929 2583 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231931 2583 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 17:33:33.234185 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231934 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231936 2583 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231939 2583 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231942 2583 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231945 2583 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231948 2583 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231950 2583 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 
17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231953 2583 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231956 2583 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231960 2583 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231963 2583 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231965 2583 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231968 2583 feature_gate.go:328] unrecognized feature gate: Example Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231971 2583 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231973 2583 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231976 2583 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231978 2583 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231982 2583 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231985 2583 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 17:33:33.234673 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231989 2583 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231992 2583 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231994 2583 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231997 2583 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.231999 2583 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232002 2583 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232004 2583 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232007 2583 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232010 2583 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232012 2583 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232015 2583 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232018 2583 feature_gate.go:328] unrecognized feature 
gate: InsightsConfig Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232020 2583 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232023 2583 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232025 2583 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232028 2583 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232030 2583 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232036 2583 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232038 2583 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232041 2583 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 17:33:33.235151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:33.232043 2583 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 17:33:33.235668 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.232049 2583 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true 
UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 17:33:33.235668 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.232859 2583 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 17:33:33.239917 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.239898 2583 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 17:33:33.242162 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.242148 2583 server.go:1019] "Starting client certificate rotation" Apr 21 17:33:33.242289 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.242272 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 17:33:33.242328 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.242312 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 17:33:33.271036 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.271012 2583 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 17:33:33.278380 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.278354 2583 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 17:33:33.295269 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.295249 2583 log.go:25] "Validated CRI v1 runtime API" Apr 21 17:33:33.301660 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.301641 2583 log.go:25] "Validated CRI v1 image API" Apr 21 17:33:33.302934 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.302919 2583 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 17:33:33.309082 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.309062 2583 fs.go:135] Filesystem UUIDs: map[4cd19ee6-6702-47e5-8a38-20d97fed3d6a:/dev/nvme0n1p4 
58df0d18-9c59-4dc7-998a-57d62ade8203:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 21 17:33:33.309141 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.309083 2583 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 17:33:33.315577 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.315457 2583 manager.go:217] Machine: {Timestamp:2026-04-21 17:33:33.313837245 +0000 UTC m=+0.453900010 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3094778 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2aef80edbc6b08c06bec7ad5d13418 SystemUUID:ec2aef80-edbc-6b08-c06b-ec7ad5d13418 BootID:33c23563-2c75-4213-96c4-88fa8f7bb0a9 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] 
NetworkDevices:[{Name:br-ex MacAddress:02:5e:1d:3f:b8:05 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5e:1d:3f:b8:05 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:a8:4e:56:a6:4c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 17:33:33.316015 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.316004 2583 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
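The `HardEvictionThresholds` in the container-manager config logged just below mix absolute quantities (`100Mi` for `memory.available`) and fractional percentages (`0.1` for `nodefs.available`). A rough sketch, using the capacities reported in the Machine and filesystem lines above, of how such thresholds resolve to byte floors (the helper function is mine for illustration, not kubelet code):

```python
# Capacities taken from the Machine/filesystem lines logged above.
MEMORY_CAPACITY = 32812175360      # bytes of RAM
NODEFS_CAPACITY = 128243970048     # bytes on /dev/nvme0n1p4 (mounted at /var)

def threshold_bytes(quantity_bytes=None, percentage=None, capacity=None):
    """An absolute quantity wins; otherwise apply the percentage to capacity."""
    if quantity_bytes is not None:
        return quantity_bytes
    return int(capacity * percentage)

# memory.available < 100Mi (absolute), nodefs.available < 10% (of capacity).
memory_floor = threshold_bytes(quantity_bytes=100 * 1024 * 1024)
nodefs_floor = threshold_bytes(percentage=0.1, capacity=NODEFS_CAPACITY)
print(memory_floor, nodefs_floor)
```

When the corresponding signal drops below its floor, the kubelet begins node-pressure eviction with the configured grace period (0s for these hard thresholds).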
Apr 21 17:33:33.316112 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.316100 2583 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 17:33:33.318981 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.318946 2583 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 17:33:33.319254 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.318985 2583 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-92.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 17:33:33.319310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.319264 2583 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 17:33:33.319310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.319278 2583 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 17:33:33.319310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.319296 2583 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 17:33:33.319388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.319317 2583 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 17:33:33.320744 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.320730 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 17:33:33.320871 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.320861 2583 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 17:33:33.324740 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.324727 2583 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 17:33:33.324781 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.324745 2583 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 17:33:33.324781 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.324758 2583 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 17:33:33.324781 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.324767 2583 kubelet.go:397] "Adding apiserver pod source"
Apr 21 17:33:33.324858 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.324783 2583 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 17:33:33.326254 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.326232 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 17:33:33.326334 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.326267 2583 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 17:33:33.329481 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.329463 2583 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 17:33:33.332315 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.332296 2583 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 17:33:33.334296 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334281 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 17:33:33.334296 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334298 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334304 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334309 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334315 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334324 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334332 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334339 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334346 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334353 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334361 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 17:33:33.334405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.334370 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 17:33:33.335216 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.335203 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 17:33:33.335216 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.335215 2583 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 17:33:33.339079 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.339064 2583 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 17:33:33.339146 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.339104 2583 server.go:1295] "Started kubelet"
Apr 21 17:33:33.339265 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.339231 2583 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 17:33:33.339328 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.339226 2583 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 17:33:33.339328 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.339313 2583 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 17:33:33.339961 ip-10-0-129-92 systemd[1]: Started Kubernetes Kubelet.
Apr 21 17:33:33.340393 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.340376 2583 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 17:33:33.343016 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.342999 2583 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 17:33:33.346613 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.346590 2583 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 17:33:33.348013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.347190 2583 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 17:33:33.348117 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.348101 2583 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 17:33:33.348196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.348124 2583 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 17:33:33.348295 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.348278 2583 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 17:33:33.348350 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.348328 2583 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 17:33:33.348350 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.348337 2583 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 17:33:33.349158 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.349140 2583 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 17:33:33.349273 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.349161 2583 factory.go:55] Registering systemd factory
Apr 21 17:33:33.349273 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.349192 2583 factory.go:223] Registration of the systemd container factory successfully
Apr 21 17:33:33.349405 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.349346 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:33.349567 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.349449 2583 factory.go:153] Registering CRI-O factory
Apr 21 17:33:33.349567 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.349461 2583 factory.go:223] Registration of the crio container factory successfully
Apr 21 17:33:33.349567 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.349476 2583 factory.go:103] Registering Raw factory
Apr 21 17:33:33.349567 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.349491 2583 manager.go:1196] Started watching for new ooms in manager
Apr 21 17:33:33.350599 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.350582 2583 manager.go:319] Starting recovery of all containers
Apr 21 17:33:33.351203 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.351121 2583 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 17:33:33.354405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.354379 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 17:33:33.354549 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.354529 2583 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-92.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 17:33:33.354874 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.353823 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-92.ec2.internal.18a86fa65e8ae910 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-92.ec2.internal,UID:ip-10-0-129-92.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-92.ec2.internal,},FirstTimestamp:2026-04-21 17:33:33.33907688 +0000 UTC m=+0.479139637,LastTimestamp:2026-04-21 17:33:33.33907688 +0000 UTC m=+0.479139637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-92.ec2.internal,}"
Apr 21 17:33:33.355834 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.355653 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 17:33:33.355975 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.355952 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 17:33:33.356327 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.356296 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-92.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 17:33:33.357209 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.357139 2583 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-92.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 17:33:33.362622 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.362605 2583 manager.go:324] Recovery completed
Apr 21 17:33:33.366756 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.366742 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 17:33:33.369305 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.369271 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 17:33:33.369384 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.369320 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 17:33:33.369384 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.369331 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientPID"
Apr 21 17:33:33.369877 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.369860 2583 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 17:33:33.369877 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.369876 2583 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 17:33:33.369956 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.369914 2583 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 17:33:33.372149 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.372082 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-92.ec2.internal.18a86fa660582501 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-92.ec2.internal,UID:ip-10-0-129-92.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-92.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-92.ec2.internal,},FirstTimestamp:2026-04-21 17:33:33.369304321 +0000 UTC m=+0.509367079,LastTimestamp:2026-04-21 17:33:33.369304321 +0000 UTC m=+0.509367079,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-92.ec2.internal,}"
Apr 21 17:33:33.372319 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.372305 2583 policy_none.go:49] "None policy: Start"
Apr 21 17:33:33.372350 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.372327 2583 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 17:33:33.372350 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.372338 2583 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 17:33:33.408096 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.408023 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-92.ec2.internal.18a86fa6605874b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-92.ec2.internal,UID:ip-10-0-129-92.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-129-92.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-129-92.ec2.internal,},FirstTimestamp:2026-04-21 17:33:33.369324722 +0000 UTC m=+0.509387480,LastTimestamp:2026-04-21 17:33:33.369324722 +0000 UTC m=+0.509387480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-92.ec2.internal,}"
Apr 21 17:33:33.416830 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.416809 2583 manager.go:341] "Starting Device Plugin manager"
Apr 21 17:33:33.430143 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.416859 2583 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 17:33:33.430143 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.416887 2583 server.go:85] "Starting device plugin registration server"
Apr 21 17:33:33.430143 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.417329 2583 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 17:33:33.430143 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.417343 2583 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 17:33:33.430143 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.417428 2583 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 17:33:33.430143 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.417529 2583 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 17:33:33.430143 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.417539 2583 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 17:33:33.430143 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.418067 2583 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 17:33:33.430143 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.418099 2583 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:33.444033 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.443938 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-92.ec2.internal.18a86fa66058a6a5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-92.ec2.internal,UID:ip-10-0-129-92.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-129-92.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-129-92.ec2.internal,},FirstTimestamp:2026-04-21 17:33:33.369337509 +0000 UTC m=+0.509400268,LastTimestamp:2026-04-21 17:33:33.369337509 +0000 UTC m=+0.509400268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-92.ec2.internal,}"
Apr 21 17:33:33.446913 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.446891 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xcb6r"
Apr 21 17:33:33.458069 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.458033 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 17:33:33.459515 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.459490 2583 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 17:33:33.459691 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.459523 2583 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 17:33:33.459691 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.459550 2583 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 17:33:33.459691 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.459559 2583 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 17:33:33.459691 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.459603 2583 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 17:33:33.464016 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.463993 2583 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 21 17:33:33.464465 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.464343 2583 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-92.ec2.internal.18a86fa66353faee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-92.ec2.internal,UID:ip-10-0-129-92.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-129-92.ec2.internal,},FirstTimestamp:2026-04-21 17:33:33.419363054 +0000 UTC m=+0.559425800,LastTimestamp:2026-04-21 17:33:33.419363054 +0000 UTC m=+0.559425800,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-92.ec2.internal,}"
Apr 21 17:33:33.467222 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.467206 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xcb6r"
Apr 21 17:33:33.517605 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.517571 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 17:33:33.518552 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.518531 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 17:33:33.518675 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.518562 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 17:33:33.518675 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.518577 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientPID"
Apr 21 17:33:33.518675 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.518606 2583 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.543189 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.543151 2583 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.543334 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.543199 2583 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-92.ec2.internal\": node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:33.560237 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.560203 2583 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal"]
Apr 21 17:33:33.560398 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.560291 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 17:33:33.561343 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.561326 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 17:33:33.561407 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.561357 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 17:33:33.561407 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.561368 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientPID"
Apr 21 17:33:33.562873 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.562858 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 17:33:33.563017 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.563003 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.563052 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.563033 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 17:33:33.563588 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.563566 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 17:33:33.563588 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.563588 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 17:33:33.563698 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.563597 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientPID"
Apr 21 17:33:33.563698 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.563644 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 17:33:33.563698 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.563665 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 17:33:33.563698 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.563674 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientPID"
Apr 21 17:33:33.565007 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.564993 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.565049 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.565018 2583 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 17:33:33.565749 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.565733 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 17:33:33.565830 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.565761 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 17:33:33.565830 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.565777 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeHasSufficientPID"
Apr 21 17:33:33.587692 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.587663 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:33.594036 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.594019 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-92.ec2.internal\" not found" node="ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.598545 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.598526 2583 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-92.ec2.internal\" not found" node="ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.687781 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.687739 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:33.750955 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.750863 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1a7117e9f363df7d23bfcc1cce2414c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal\" (UID: \"d1a7117e9f363df7d23bfcc1cce2414c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.750955 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.750900 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1a7117e9f363df7d23bfcc1cce2414c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal\" (UID: \"d1a7117e9f363df7d23bfcc1cce2414c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.750955 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.750917 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/666dcd2066b38a4dbb7941535c2eb7f9-config\") pod \"kube-apiserver-proxy-ip-10-0-129-92.ec2.internal\" (UID: \"666dcd2066b38a4dbb7941535c2eb7f9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.788381 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.788356 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:33.851659 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.851631 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1a7117e9f363df7d23bfcc1cce2414c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal\" (UID: \"d1a7117e9f363df7d23bfcc1cce2414c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.851777 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.851665 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1a7117e9f363df7d23bfcc1cce2414c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal\" (UID: \"d1a7117e9f363df7d23bfcc1cce2414c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.851777 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.851687 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/666dcd2066b38a4dbb7941535c2eb7f9-config\") pod \"kube-apiserver-proxy-ip-10-0-129-92.ec2.internal\" (UID: \"666dcd2066b38a4dbb7941535c2eb7f9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.851777 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.851746 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1a7117e9f363df7d23bfcc1cce2414c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal\" (UID: \"d1a7117e9f363df7d23bfcc1cce2414c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.851894 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.851803 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1a7117e9f363df7d23bfcc1cce2414c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal\" (UID: \"d1a7117e9f363df7d23bfcc1cce2414c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.851894 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.851856 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/666dcd2066b38a4dbb7941535c2eb7f9-config\") pod \"kube-apiserver-proxy-ip-10-0-129-92.ec2.internal\" (UID: \"666dcd2066b38a4dbb7941535c2eb7f9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.888697 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.888661 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:33.895887 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.895864 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.901639 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:33.901612 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal"
Apr 21 17:33:33.989188 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:33.989128 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:34.089706 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:34.089628 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:34.190206 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:34.190154 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:34.240494 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.240440 2583 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 17:33:34.291138 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:34.291104 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:34.347647 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.347566 2583 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 17:33:34.391384 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:34.391358 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found"
Apr 21 17:33:34.393984 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.393964 2583 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 17:33:34.399535 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:34.399501 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod666dcd2066b38a4dbb7941535c2eb7f9.slice/crio-54e1f250969514e3f68213dc09ebe30186542116ee9c86d029a87aac3d739df1 WatchSource:0}: Error finding container 54e1f250969514e3f68213dc09ebe30186542116ee9c86d029a87aac3d739df1: Status 404 returned error can't find the container with id 54e1f250969514e3f68213dc09ebe30186542116ee9c86d029a87aac3d739df1
Apr 21 17:33:34.400001 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:34.399978 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a7117e9f363df7d23bfcc1cce2414c.slice/crio-86fc941a8c1e52ddf2dcc39db9a162a439f18fbcae0ea53abeb461f5d69da984 WatchSource:0}: Error finding container 86fc941a8c1e52ddf2dcc39db9a162a439f18fbcae0ea53abeb461f5d69da984: Status 404 returned error can't find the container with id 86fc941a8c1e52ddf2dcc39db9a162a439f18fbcae0ea53abeb461f5d69da984
Apr 21 17:33:34.405364 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.405346 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 17:33:34.458431 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.458396 2583 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-86fzr"
Apr 21 17:33:34.463449 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.463403 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal" event={"ID":"d1a7117e9f363df7d23bfcc1cce2414c","Type":"ContainerStarted","Data":"86fc941a8c1e52ddf2dcc39db9a162a439f18fbcae0ea53abeb461f5d69da984"}
Apr 21 17:33:34.464250 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.464231 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal"
event={"ID":"666dcd2066b38a4dbb7941535c2eb7f9","Type":"ContainerStarted","Data":"54e1f250969514e3f68213dc09ebe30186542116ee9c86d029a87aac3d739df1"} Apr 21 17:33:34.469406 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.469380 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 17:28:33 +0000 UTC" deadline="2028-01-16 05:40:39.667624308 +0000 UTC" Apr 21 17:33:34.469464 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.469407 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15228h7m5.198220943s" Apr 21 17:33:34.473946 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.473930 2583 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-86fzr" Apr 21 17:33:34.491625 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:34.491589 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found" Apr 21 17:33:34.591830 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:34.591791 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found" Apr 21 17:33:34.636610 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.636520 2583 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 17:33:34.692851 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:34.692805 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found" Apr 21 17:33:34.708440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.708410 2583 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 17:33:34.755889 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.755860 2583 
reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 17:33:34.793492 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:34.793452 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found" Apr 21 17:33:34.894211 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:34.894105 2583 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-92.ec2.internal\" not found" Apr 21 17:33:34.902250 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.902215 2583 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 17:33:34.948983 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.948936 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal" Apr 21 17:33:34.964706 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.964665 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 17:33:34.964887 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.964812 2583 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal" Apr 21 17:33:34.976901 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:34.976864 2583 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 17:33:35.327232 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.327195 2583 apiserver.go:52] "Watching apiserver" Apr 21 17:33:35.334238 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.334209 2583 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" 
Apr 21 17:33:35.336453 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.336426 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal","openshift-multus/multus-sgc5c","openshift-ovn-kubernetes/ovnkube-node-xfgp5","kube-system/konnectivity-agent-sjdgx","openshift-image-registry/node-ca-hmlbc","openshift-multus/multus-additional-cni-plugins-4qt27","openshift-multus/network-metrics-daemon-rfmv6","openshift-network-diagnostics/network-check-target-5bfpn","openshift-network-operator/iptables-alerter-ksjjv","kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s","openshift-cluster-node-tuning-operator/tuned-gblls","openshift-dns/node-resolver-hccth"] Apr 21 17:33:35.338508 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.338482 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:33:35.338623 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.338560 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da" Apr 21 17:33:35.339593 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.339561 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.341100 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.341065 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.342603 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.342583 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 17:33:35.342697 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.342582 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 17:33:35.343279 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.342947 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 17:33:35.343279 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.342962 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g24gs\"" Apr 21 17:33:35.343279 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.343014 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sjdgx" Apr 21 17:33:35.343874 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.343853 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 17:33:35.344611 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.344592 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 17:33:35.344698 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.344595 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-95lrc\"" Apr 21 17:33:35.345416 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.345397 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hmlbc" Apr 21 17:33:35.345500 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.345486 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.346954 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.346844 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:33:35.346954 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.346913 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd" Apr 21 17:33:35.348136 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.348036 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 17:33:35.351013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.350475 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-ksjjv" Apr 21 17:33:35.351013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.350556 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 17:33:35.351013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.350809 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 17:33:35.351711 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.351627 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 17:33:35.351971 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.351952 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 17:33:35.352127 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.352106 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 17:33:35.352219 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.352153 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-n5nk4\"" Apr 21 17:33:35.352219 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.352195 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jgxnc\"" Apr 21 17:33:35.352374 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.352113 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q4hxb\"" Apr 21 17:33:35.352433 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.352382 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 17:33:35.352645 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.352629 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 17:33:35.352699 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.352671 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 17:33:35.352831 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.352386 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 17:33:35.353077 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.353062 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 17:33:35.353506 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.353429 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hp4ht\"" Apr 21 17:33:35.353506 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.353483 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 17:33:35.354308 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.354287 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" Apr 21 17:33:35.355024 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.354998 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:33:35.355514 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.355498 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gblls" Apr 21 17:33:35.355601 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.355517 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hccth" Apr 21 17:33:35.358824 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.358802 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 17:33:35.358824 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.358817 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 17:33:35.358824 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.358805 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 17:33:35.359021 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.358826 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 17:33:35.359092 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.359065 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 17:33:35.360191 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360159 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 17:33:35.360284 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360203 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-hostroot\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " 
pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.360284 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360260 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-etc-kubernetes\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.360284 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360212 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8xq45\"" Apr 21 17:33:35.360440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360286 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nzsmm\"" Apr 21 17:33:35.360440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360288 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/45e8c620-ac92-4664-985c-5abe0fc26bed-agent-certs\") pod \"konnectivity-agent-sjdgx\" (UID: \"45e8c620-ac92-4664-985c-5abe0fc26bed\") " pod="kube-system/konnectivity-agent-sjdgx" Apr 21 17:33:35.360440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360360 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 17:33:35.360440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360332 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6ghv\" (UniqueName: \"kubernetes.io/projected/47035621-4957-4280-94ce-ecd6810f7254-kube-api-access-d6ghv\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.360440 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:33:35.360397 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-etc-selinux\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" Apr 21 17:33:35.360440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360435 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-sys-fs\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" Apr 21 17:33:35.360674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360462 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-multus-cni-dir\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.360674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360265 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-shh2b\"" Apr 21 17:33:35.360674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360493 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:33:35.360674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360486 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-var-lib-kubelet\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.360674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360539 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-run-ovn\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.360674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360564 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edc1db03-a462-4f21-bb36-369766777418-cni-binary-copy\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.360674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360589 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-multus-conf-dir\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.360674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360620 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edc1db03-a462-4f21-bb36-369766777418-multus-daemon-config\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.360674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360651 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5d8e485-86a1-4255-912b-af222842087c-host\") pod \"node-ca-hmlbc\" (UID: \"d5d8e485-86a1-4255-912b-af222842087c\") " pod="openshift-image-registry/node-ca-hmlbc" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360676 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-run-systemd\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360702 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-var-lib-cni-bin\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360744 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-multus-socket-dir-parent\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360786 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-kubelet-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360813 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-socket-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360842 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-var-lib-cni-multus\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360871 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-systemd-units\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360892 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-os-release\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360953 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47035621-4957-4280-94ce-ecd6810f7254-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4qt27\" (UID: 
\"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.360968 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-etc-openvswitch\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.361031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361019 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-node-log\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361040 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361082 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftvdk\" (UniqueName: \"kubernetes.io/projected/14778c8a-9e3f-4e53-aea1-4de908a64e9f-kube-api-access-ftvdk\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361116 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/45e8c620-ac92-4664-985c-5abe0fc26bed-konnectivity-ca\") pod \"konnectivity-agent-sjdgx\" (UID: \"45e8c620-ac92-4664-985c-5abe0fc26bed\") " pod="kube-system/konnectivity-agent-sjdgx"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361144 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-system-cni-dir\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361161 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-cnibin\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361210 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzrs\" (UniqueName: \"kubernetes.io/projected/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-kube-api-access-jwzrs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361234 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-system-cni-dir\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361256 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-cni-bin\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361284 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56-iptables-alerter-script\") pod \"iptables-alerter-ksjjv\" (UID: \"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56\") " pod="openshift-network-operator/iptables-alerter-ksjjv"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361306 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-cnibin\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361329 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-run-netns\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361351 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xqsc\" (UniqueName: \"kubernetes.io/projected/edc1db03-a462-4f21-bb36-369766777418-kube-api-access-4xqsc\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361374 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-log-socket\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361414 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-run-ovn-kubernetes\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361448 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14778c8a-9e3f-4e53-aea1-4de908a64e9f-ovn-node-metrics-cert\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.361520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361479 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56-host-slash\") pod \"iptables-alerter-ksjjv\" (UID: \"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56\") " pod="openshift-network-operator/iptables-alerter-ksjjv"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361508 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-os-release\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361532 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-run-multus-certs\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361547 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-run-netns\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361569 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-run-openvswitch\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361604 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5d8e485-86a1-4255-912b-af222842087c-serviceca\") pod \"node-ca-hmlbc\" (UID: \"d5d8e485-86a1-4255-912b-af222842087c\") " pod="openshift-image-registry/node-ca-hmlbc"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361634 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5jzm\" (UniqueName: \"kubernetes.io/projected/eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56-kube-api-access-q5jzm\") pod \"iptables-alerter-ksjjv\" (UID: \"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56\") " pod="openshift-network-operator/iptables-alerter-ksjjv"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361659 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-var-lib-openvswitch\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361696 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14778c8a-9e3f-4e53-aea1-4de908a64e9f-ovnkube-config\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361721 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqnwn\" (UniqueName: \"kubernetes.io/projected/d5d8e485-86a1-4255-912b-af222842087c-kube-api-access-tqnwn\") pod \"node-ca-hmlbc\" (UID: \"d5d8e485-86a1-4255-912b-af222842087c\") " pod="openshift-image-registry/node-ca-hmlbc"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361742 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361768 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/47035621-4957-4280-94ce-ecd6810f7254-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361794 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-registration-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361821 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-cni-netd\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361855 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14778c8a-9e3f-4e53-aea1-4de908a64e9f-env-overrides\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361878 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47035621-4957-4280-94ce-ecd6810f7254-cni-binary-copy\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.362236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361893 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:35.362802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361907 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd69d\" (UniqueName: \"kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d\") pod \"network-check-target-5bfpn\" (UID: \"fbb6a7fe-cc60-43c1-919d-78f0d38148cd\") " pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:35.362802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361952 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl89t\" (UniqueName: \"kubernetes.io/projected/aaab4344-56a1-42b9-9a96-5071b6e23282-kube-api-access-nl89t\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.362802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.361998 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-run-k8s-cni-cncf-io\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.362802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.362023 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-kubelet\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.362802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.362049 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-slash\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.362802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.362076 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14778c8a-9e3f-4e53-aea1-4de908a64e9f-ovnkube-script-lib\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.362802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.362100 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-device-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.365647 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.365601 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 17:33:35.449196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.449148 2583 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 17:33:35.462342 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462304 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-cni-netd\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.462515 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462350 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14778c8a-9e3f-4e53-aea1-4de908a64e9f-env-overrides\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.462515 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462380 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47035621-4957-4280-94ce-ecd6810f7254-cni-binary-copy\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.462515 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462392 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-cni-netd\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.462515 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462409 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/16529597-1f2f-47de-ade5-9fb7b122147c-tmp-dir\") pod \"node-resolver-hccth\" (UID: \"16529597-1f2f-47de-ade5-9fb7b122147c\") " pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.462713 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462560 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:35.462713 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462588 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd69d\" (UniqueName: \"kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d\") pod \"network-check-target-5bfpn\" (UID: \"fbb6a7fe-cc60-43c1-919d-78f0d38148cd\") " pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:35.462713 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462623 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl89t\" (UniqueName: \"kubernetes.io/projected/aaab4344-56a1-42b9-9a96-5071b6e23282-kube-api-access-nl89t\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.462713 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462654 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-sysconfig\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.462713 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462677 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-systemd\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.462713 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462703 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-sys\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.462926 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.462724 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 17:33:35.462926 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462731 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5kff\" (UniqueName: \"kubernetes.io/projected/16529597-1f2f-47de-ade5-9fb7b122147c-kube-api-access-k5kff\") pod \"node-resolver-hccth\" (UID: \"16529597-1f2f-47de-ade5-9fb7b122147c\") " pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.462926 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462756 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-run-k8s-cni-cncf-io\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.462926 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462781 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-kubelet\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.462926 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.462810 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs podName:38cd15ba-d0c7-4b4f-b220-f72981ccd9da nodeName:}" failed. No retries permitted until 2026-04-21 17:33:35.96277691 +0000 UTC m=+3.102839675 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs") pod "network-metrics-daemon-rfmv6" (UID: "38cd15ba-d0c7-4b4f-b220-f72981ccd9da") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 17:33:35.462926 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462827 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-kubelet\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.462926 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462903 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-run-k8s-cni-cncf-io\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.463188 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462928 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-slash\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.463188 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462935 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/14778c8a-9e3f-4e53-aea1-4de908a64e9f-env-overrides\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.463188 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.462956 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-modprobe-d\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.463188 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463008 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-slash\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.463188 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463054 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s9wb\" (UniqueName: \"kubernetes.io/projected/b1a13fb4-794b-44ec-aaad-3da758847a9e-kube-api-access-8s9wb\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.463188 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463067 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/47035621-4957-4280-94ce-ecd6810f7254-cni-binary-copy\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.463188 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463087 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14778c8a-9e3f-4e53-aea1-4de908a64e9f-ovnkube-script-lib\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.463188 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463114 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-device-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463219 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-sysctl-d\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463249 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-hostroot\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463266 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-etc-kubernetes\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463282 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/45e8c620-ac92-4664-985c-5abe0fc26bed-agent-certs\") pod \"konnectivity-agent-sjdgx\" (UID: \"45e8c620-ac92-4664-985c-5abe0fc26bed\") " pod="kube-system/konnectivity-agent-sjdgx"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463304 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6ghv\" (UniqueName: \"kubernetes.io/projected/47035621-4957-4280-94ce-ecd6810f7254-kube-api-access-d6ghv\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463320 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-etc-selinux\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463320 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-etc-kubernetes\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463320 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-hostroot\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463335 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-sys-fs\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463368 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-run\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463385 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-sys-fs\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463339 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-device-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463404 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-etc-selinux\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.463553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463521 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-multus-cni-dir\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.463961 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.463539 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/14778c8a-9e3f-4e53-aea1-4de908a64e9f-ovnkube-script-lib\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.464100 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464079 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-multus-cni-dir\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.464164 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464149 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-var-lib-kubelet\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.464257 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464242 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-run-ovn\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.464309 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464284 2583 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 17:33:35.464389 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464311 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-run-ovn\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.464389 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464313 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edc1db03-a462-4f21-bb36-369766777418-cni-binary-copy\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.464389 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464361 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-multus-conf-dir\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.464522 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464387 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edc1db03-a462-4f21-bb36-369766777418-multus-daemon-config\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.464522 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464417 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5d8e485-86a1-4255-912b-af222842087c-host\") pod \"node-ca-hmlbc\" (UID: \"d5d8e485-86a1-4255-912b-af222842087c\") " pod="openshift-image-registry/node-ca-hmlbc"
Apr 21 17:33:35.464522 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464450 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/16529597-1f2f-47de-ade5-9fb7b122147c-hosts-file\") pod \"node-resolver-hccth\" (UID: \"16529597-1f2f-47de-ade5-9fb7b122147c\") " pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.464522 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464484 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-run-systemd\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.464522 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464516 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-var-lib-cni-bin\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.464758 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464548 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-multus-socket-dir-parent\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.464758 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464576 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-kubelet-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.464758 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464607 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-socket-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.464758 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464641 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-var-lib-cni-multus\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.464758 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464677 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-systemd-units\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.464758 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464710 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-os-release\") pod
\"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.464758 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464742 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-lib-modules\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464767 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-host\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464798 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47035621-4957-4280-94ce-ecd6810f7254-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464827 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-etc-openvswitch\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464826 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/edc1db03-a462-4f21-bb36-369766777418-cni-binary-copy\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464855 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-node-log\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464885 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464910 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftvdk\" (UniqueName: \"kubernetes.io/projected/14778c8a-9e3f-4e53-aea1-4de908a64e9f-kube-api-access-ftvdk\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464939 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/45e8c620-ac92-4664-985c-5abe0fc26bed-konnectivity-ca\") pod \"konnectivity-agent-sjdgx\" (UID: \"45e8c620-ac92-4664-985c-5abe0fc26bed\") " pod="kube-system/konnectivity-agent-sjdgx" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464952 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-var-lib-kubelet\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464966 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-system-cni-dir\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.465013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-cnibin\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465026 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzrs\" (UniqueName: \"kubernetes.io/projected/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-kube-api-access-jwzrs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465029 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-multus-conf-dir\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465074 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-system-cni-dir\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465110 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-cni-bin\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465141 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56-iptables-alerter-script\") pod \"iptables-alerter-ksjjv\" (UID: \"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56\") " pod="openshift-network-operator/iptables-alerter-ksjjv" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465145 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-run-systemd\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465191 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-socket-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465205 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-tuned\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465256 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-var-lib-cni-multus\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465317 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-node-log\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465354 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-os-release\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465361 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d5d8e485-86a1-4255-912b-af222842087c-host\") pod \"node-ca-hmlbc\" (UID: \"d5d8e485-86a1-4255-912b-af222842087c\") " pod="openshift-image-registry/node-ca-hmlbc" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465397 2583 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-var-lib-cni-bin\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465401 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.464910 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-kubelet-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" Apr 21 17:33:35.465545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465463 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-multus-socket-dir-parent\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465760 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edc1db03-a462-4f21-bb36-369766777418-multus-daemon-config\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465805 2583 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1a13fb4-794b-44ec-aaad-3da758847a9e-tmp\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465871 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-cnibin\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465899 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/45e8c620-ac92-4664-985c-5abe0fc26bed-konnectivity-ca\") pod \"konnectivity-agent-sjdgx\" (UID: \"45e8c620-ac92-4664-985c-5abe0fc26bed\") " pod="kube-system/konnectivity-agent-sjdgx" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465900 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-run-netns\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465940 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-run-netns\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465962 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4xqsc\" (UniqueName: \"kubernetes.io/projected/edc1db03-a462-4f21-bb36-369766777418-kube-api-access-4xqsc\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465981 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-log-socket\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.465982 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-cni-bin\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466009 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-run-ovn-kubernetes\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466048 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14778c8a-9e3f-4e53-aea1-4de908a64e9f-ovn-node-metrics-cert\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466090 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-system-cni-dir\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466131 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56-host-slash\") pod \"iptables-alerter-ksjjv\" (UID: \"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56\") " pod="openshift-network-operator/iptables-alerter-ksjjv" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466136 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-cnibin\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466203 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-system-cni-dir\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.466206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466213 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-os-release\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466270 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-etc-openvswitch\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466302 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-run-multus-certs\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466346 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-systemd-units\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466401 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-os-release\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466571 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-cnibin\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466615 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-run-netns\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466647 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-run-openvswitch\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466677 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5d8e485-86a1-4255-912b-af222842087c-serviceca\") pod \"node-ca-hmlbc\" (UID: \"d5d8e485-86a1-4255-912b-af222842087c\") " pod="openshift-image-registry/node-ca-hmlbc" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466675 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/47035621-4957-4280-94ce-ecd6810f7254-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466727 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-run-ovn-kubernetes\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466735 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56-host-slash\") pod \"iptables-alerter-ksjjv\" (UID: \"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56\") " pod="openshift-network-operator/iptables-alerter-ksjjv" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466774 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5jzm\" (UniqueName: \"kubernetes.io/projected/eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56-kube-api-access-q5jzm\") pod \"iptables-alerter-ksjjv\" (UID: \"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56\") " pod="openshift-network-operator/iptables-alerter-ksjjv" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466800 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edc1db03-a462-4f21-bb36-369766777418-host-run-multus-certs\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466812 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-kubernetes\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466837 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-var-lib-openvswitch\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466843 2583 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-log-socket\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466859 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14778c8a-9e3f-4e53-aea1-4de908a64e9f-ovnkube-config\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466916 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-var-lib-openvswitch\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466923 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqnwn\" (UniqueName: \"kubernetes.io/projected/d5d8e485-86a1-4255-912b-af222842087c-kube-api-access-tqnwn\") pod \"node-ca-hmlbc\" (UID: \"d5d8e485-86a1-4255-912b-af222842087c\") " pod="openshift-image-registry/node-ca-hmlbc" Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466963 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-host-run-netns\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.466992 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467003 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/14778c8a-9e3f-4e53-aea1-4de908a64e9f-run-openvswitch\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467035 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/47035621-4957-4280-94ce-ecd6810f7254-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467069 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-registration-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467108 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56-iptables-alerter-script\") pod \"iptables-alerter-ksjjv\" (UID: \"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56\") " pod="openshift-network-operator/iptables-alerter-ksjjv"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467127 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-sysctl-conf\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467153 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/47035621-4957-4280-94ce-ecd6810f7254-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467163 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-var-lib-kubelet\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467345 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/aaab4344-56a1-42b9-9a96-5071b6e23282-registration-dir\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467490 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d5d8e485-86a1-4255-912b-af222842087c-serviceca\") pod \"node-ca-hmlbc\" (UID: \"d5d8e485-86a1-4255-912b-af222842087c\") " pod="openshift-image-registry/node-ca-hmlbc"
Apr 21 17:33:35.467882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467569 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/47035621-4957-4280-94ce-ecd6810f7254-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.468480 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.467919 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/14778c8a-9e3f-4e53-aea1-4de908a64e9f-ovnkube-config\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.469675 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.469654 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/45e8c620-ac92-4664-985c-5abe0fc26bed-agent-certs\") pod \"konnectivity-agent-sjdgx\" (UID: \"45e8c620-ac92-4664-985c-5abe0fc26bed\") " pod="kube-system/konnectivity-agent-sjdgx"
Apr 21 17:33:35.470619 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.470603 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/14778c8a-9e3f-4e53-aea1-4de908a64e9f-ovn-node-metrics-cert\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.477673 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.477636 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 17:28:34 +0000 UTC" deadline="2027-12-18 17:33:02.674212062 +0000 UTC"
Apr 21 17:33:35.477673 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.477673 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14543h59m27.196543288s"
Apr 21 17:33:35.482520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.482464 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl89t\" (UniqueName: \"kubernetes.io/projected/aaab4344-56a1-42b9-9a96-5071b6e23282-kube-api-access-nl89t\") pod \"aws-ebs-csi-driver-node-msn6s\" (UID: \"aaab4344-56a1-42b9-9a96-5071b6e23282\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.485727 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.485701 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 17:33:35.485844 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.485731 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 17:33:35.485844 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.485747 2583 projected.go:194] Error preparing data for projected volume kube-api-access-jd69d for pod openshift-network-diagnostics/network-check-target-5bfpn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 17:33:35.485844 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.485836 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d podName:fbb6a7fe-cc60-43c1-919d-78f0d38148cd nodeName:}" failed. No retries permitted until 2026-04-21 17:33:35.98581566 +0000 UTC m=+3.125878409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jd69d" (UniqueName: "kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d") pod "network-check-target-5bfpn" (UID: "fbb6a7fe-cc60-43c1-919d-78f0d38148cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 17:33:35.488363 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.488340 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xqsc\" (UniqueName: \"kubernetes.io/projected/edc1db03-a462-4f21-bb36-369766777418-kube-api-access-4xqsc\") pod \"multus-sgc5c\" (UID: \"edc1db03-a462-4f21-bb36-369766777418\") " pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.488477 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.488385 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzrs\" (UniqueName: \"kubernetes.io/projected/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-kube-api-access-jwzrs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:35.488535 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.488476 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6ghv\" (UniqueName: \"kubernetes.io/projected/47035621-4957-4280-94ce-ecd6810f7254-kube-api-access-d6ghv\") pod \"multus-additional-cni-plugins-4qt27\" (UID: \"47035621-4957-4280-94ce-ecd6810f7254\") " pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.489649 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.489624 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftvdk\" (UniqueName: \"kubernetes.io/projected/14778c8a-9e3f-4e53-aea1-4de908a64e9f-kube-api-access-ftvdk\") pod \"ovnkube-node-xfgp5\" (UID: \"14778c8a-9e3f-4e53-aea1-4de908a64e9f\") " pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.491186 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.491152 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5jzm\" (UniqueName: \"kubernetes.io/projected/eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56-kube-api-access-q5jzm\") pod \"iptables-alerter-ksjjv\" (UID: \"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56\") " pod="openshift-network-operator/iptables-alerter-ksjjv"
Apr 21 17:33:35.492235 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.492216 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqnwn\" (UniqueName: \"kubernetes.io/projected/d5d8e485-86a1-4255-912b-af222842087c-kube-api-access-tqnwn\") pod \"node-ca-hmlbc\" (UID: \"d5d8e485-86a1-4255-912b-af222842087c\") " pod="openshift-image-registry/node-ca-hmlbc"
Apr 21 17:33:35.567900 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.567856 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-run\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568061 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.567917 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/16529597-1f2f-47de-ade5-9fb7b122147c-hosts-file\") pod \"node-resolver-hccth\" (UID: \"16529597-1f2f-47de-ade5-9fb7b122147c\") " pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.568061 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.567939 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-run\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568061 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.567947 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-lib-modules\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568061 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.567968 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-host\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568061 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568003 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/16529597-1f2f-47de-ade5-9fb7b122147c-hosts-file\") pod \"node-resolver-hccth\" (UID: \"16529597-1f2f-47de-ade5-9fb7b122147c\") " pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.568061 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568028 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-host\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568061 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568050 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-tuned\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568066 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1a13fb4-794b-44ec-aaad-3da758847a9e-tmp\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568098 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-kubernetes\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568113 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-lib-modules\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568185 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-kubernetes\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568127 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-sysctl-conf\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568254 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-sysctl-conf\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568261 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-var-lib-kubelet\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568296 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/16529597-1f2f-47de-ade5-9fb7b122147c-tmp-dir\") pod \"node-resolver-hccth\" (UID: \"16529597-1f2f-47de-ade5-9fb7b122147c\") " pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568349 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-sysconfig\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568373 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-systemd\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568376 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-var-lib-kubelet\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.568388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568394 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-sys\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568420 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5kff\" (UniqueName: \"kubernetes.io/projected/16529597-1f2f-47de-ade5-9fb7b122147c-kube-api-access-k5kff\") pod \"node-resolver-hccth\" (UID: \"16529597-1f2f-47de-ade5-9fb7b122147c\") " pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568451 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-modprobe-d\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568434 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-sysconfig\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568487 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-systemd\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568497 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-sys\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568544 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s9wb\" (UniqueName: \"kubernetes.io/projected/b1a13fb4-794b-44ec-aaad-3da758847a9e-kube-api-access-8s9wb\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568570 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-modprobe-d\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568577 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-sysctl-d\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568690 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-sysctl-d\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.569031 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.568733 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/16529597-1f2f-47de-ade5-9fb7b122147c-tmp-dir\") pod \"node-resolver-hccth\" (UID: \"16529597-1f2f-47de-ade5-9fb7b122147c\") " pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.571116 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.571094 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b1a13fb4-794b-44ec-aaad-3da758847a9e-etc-tuned\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.571240 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.571131 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1a13fb4-794b-44ec-aaad-3da758847a9e-tmp\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.580400 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.580341 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5kff\" (UniqueName: \"kubernetes.io/projected/16529597-1f2f-47de-ade5-9fb7b122147c-kube-api-access-k5kff\") pod \"node-resolver-hccth\" (UID: \"16529597-1f2f-47de-ade5-9fb7b122147c\") " pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.580400 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.580350 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s9wb\" (UniqueName: \"kubernetes.io/projected/b1a13fb4-794b-44ec-aaad-3da758847a9e-kube-api-access-8s9wb\") pod \"tuned-gblls\" (UID: \"b1a13fb4-794b-44ec-aaad-3da758847a9e\") " pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.651206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.651152 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sgc5c"
Apr 21 17:33:35.660748 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.660722 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:35.669594 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.669551 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-sjdgx"
Apr 21 17:33:35.675302 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.675281 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hmlbc"
Apr 21 17:33:35.681445 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.681420 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4qt27"
Apr 21 17:33:35.688076 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.688053 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ksjjv"
Apr 21 17:33:35.695717 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.695693 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s"
Apr 21 17:33:35.703347 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.703319 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gblls"
Apr 21 17:33:35.709013 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.708993 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hccth"
Apr 21 17:33:35.971249 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:35.971196 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:35.971448 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.971356 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 17:33:35.971448 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:35.971440 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs podName:38cd15ba-d0c7-4b4f-b220-f72981ccd9da nodeName:}" failed. No retries permitted until 2026-04-21 17:33:36.971417645 +0000 UTC m=+4.111480404 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs") pod "network-metrics-daemon-rfmv6" (UID: "38cd15ba-d0c7-4b4f-b220-f72981ccd9da") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 17:33:36.046937 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:36.046906 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaab4344_56a1_42b9_9a96_5071b6e23282.slice/crio-dc298a8363103f23e266de55c52f2d65917b7a790e3ea3286a66e0e5be75ed20 WatchSource:0}: Error finding container dc298a8363103f23e266de55c52f2d65917b7a790e3ea3286a66e0e5be75ed20: Status 404 returned error can't find the container with id dc298a8363103f23e266de55c52f2d65917b7a790e3ea3286a66e0e5be75ed20
Apr 21 17:33:36.048237 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:36.048209 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a13fb4_794b_44ec_aaad_3da758847a9e.slice/crio-8ca45ea24202b101b8b08fb57386956f9ea0a15013a32dd9581c1ed6adb154c2 WatchSource:0}: Error finding container 8ca45ea24202b101b8b08fb57386956f9ea0a15013a32dd9581c1ed6adb154c2: Status 404 returned error can't find the container with id 8ca45ea24202b101b8b08fb57386956f9ea0a15013a32dd9581c1ed6adb154c2
Apr 21 17:33:36.049570 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:36.049513 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45e8c620_ac92_4664_985c_5abe0fc26bed.slice/crio-b02df8ab37b42d69e80aa44f6d890656de47009aa6d98c767d981d86e9ea53f9 WatchSource:0}: Error finding container b02df8ab37b42d69e80aa44f6d890656de47009aa6d98c767d981d86e9ea53f9: Status 404 returned error can't find the container with id b02df8ab37b42d69e80aa44f6d890656de47009aa6d98c767d981d86e9ea53f9
Apr 21 17:33:36.050827 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:36.050796 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb995f1e_e69f_4fb1_b42d_0bc3ebbc7b56.slice/crio-9195dcb8334ce5a18581fae03f5d8b0ee4dd55a70de7c2b4526aeaa02244cb61 WatchSource:0}: Error finding container 9195dcb8334ce5a18581fae03f5d8b0ee4dd55a70de7c2b4526aeaa02244cb61: Status 404 returned error can't find the container with id 9195dcb8334ce5a18581fae03f5d8b0ee4dd55a70de7c2b4526aeaa02244cb61
Apr 21 17:33:36.053658 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:36.053627 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14778c8a_9e3f_4e53_aea1_4de908a64e9f.slice/crio-7c0996be6cfc653cebd859a4650297a7d219def52f3d8216fd877871854a819f WatchSource:0}: Error finding container 7c0996be6cfc653cebd859a4650297a7d219def52f3d8216fd877871854a819f: Status 404 returned error can't find the container with id 7c0996be6cfc653cebd859a4650297a7d219def52f3d8216fd877871854a819f
Apr 21 17:33:36.054319 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:36.054296 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d8e485_86a1_4255_912b_af222842087c.slice/crio-41109a926c442f707538bc96ce415757766028fa794d7c4e4d2e7af8084664d0 WatchSource:0}: Error finding container 41109a926c442f707538bc96ce415757766028fa794d7c4e4d2e7af8084664d0: Status 404 returned error can't find the container with id 41109a926c442f707538bc96ce415757766028fa794d7c4e4d2e7af8084664d0
Apr 21 17:33:36.056151 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:36.055816 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedc1db03_a462_4f21_bb36_369766777418.slice/crio-aac9cd787c384da5a6ba1b7a73d4f0e984cd691f425a8f38edba8f0e96df95ae WatchSource:0}: Error finding container aac9cd787c384da5a6ba1b7a73d4f0e984cd691f425a8f38edba8f0e96df95ae: Status 404 returned error can't find the container with id aac9cd787c384da5a6ba1b7a73d4f0e984cd691f425a8f38edba8f0e96df95ae
Apr 21 17:33:36.056875 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:33:36.056851 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47035621_4957_4280_94ce_ecd6810f7254.slice/crio-2706b7045797bbada357d9ec81ad701741de68ab79347f026fc7a1af1b69c51a WatchSource:0}: Error finding container 2706b7045797bbada357d9ec81ad701741de68ab79347f026fc7a1af1b69c51a: Status 404 returned error can't find the container with id 2706b7045797bbada357d9ec81ad701741de68ab79347f026fc7a1af1b69c51a
Apr 21 17:33:36.071726 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.071522 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd69d\" (UniqueName: \"kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d\") pod \"network-check-target-5bfpn\" (UID: \"fbb6a7fe-cc60-43c1-919d-78f0d38148cd\") " pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:36.071824 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.071664 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 17:33:36.071824 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.071801 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 17:33:36.071824 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.071812 2583 projected.go:194] Error preparing data for projected volume kube-api-access-jd69d for pod openshift-network-diagnostics/network-check-target-5bfpn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 17:33:36.071972 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.071862 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d podName:fbb6a7fe-cc60-43c1-919d-78f0d38148cd nodeName:}" failed. No retries permitted until 2026-04-21 17:33:37.071845799 +0000 UTC m=+4.211908563 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jd69d" (UniqueName: "kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d") pod "network-check-target-5bfpn" (UID: "fbb6a7fe-cc60-43c1-919d-78f0d38148cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 17:33:36.315138 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.315060 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-94ktg"]
Apr 21 17:33:36.316850 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.316828 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:36.316960 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.316899 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:33:36.373322 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.373287 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:36.373983 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.373387 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b0a2f124-319a-473e-9b27-5c36c13da638-kubelet-config\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:36.373983 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.373437 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b0a2f124-319a-473e-9b27-5c36c13da638-dbus\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:36.460913 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.460397 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:36.460913 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.460550 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:33:36.474115 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.474075 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:36.474362 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.474163 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b0a2f124-319a-473e-9b27-5c36c13da638-kubelet-config\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:36.474362 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.474225 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b0a2f124-319a-473e-9b27-5c36c13da638-dbus\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:36.474514 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.474495 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b0a2f124-319a-473e-9b27-5c36c13da638-dbus\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:36.474627 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.474611 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 17:33:36.474691 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.474672 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret podName:b0a2f124-319a-473e-9b27-5c36c13da638 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:36.974653315 +0000 UTC m=+4.114716075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret") pod "global-pull-secret-syncer-94ktg" (UID: "b0a2f124-319a-473e-9b27-5c36c13da638") : object "kube-system"/"original-pull-secret" not registered
Apr 21 17:33:36.474755 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.474726 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b0a2f124-319a-473e-9b27-5c36c13da638-kubelet-config\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:36.476121 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.476074 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" event={"ID":"aaab4344-56a1-42b9-9a96-5071b6e23282","Type":"ContainerStarted","Data":"dc298a8363103f23e266de55c52f2d65917b7a790e3ea3286a66e0e5be75ed20"}
Apr 21 17:33:36.478412 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.478364 2583 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 17:28:34 +0000 UTC" deadline="2027-10-19 04:49:00.532916537 +0000 UTC"
Apr 21 17:33:36.478412 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.478393 2583 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13091h15m24.054526677s"
Apr 21 17:33:36.482163 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.482059
2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hccth" event={"ID":"16529597-1f2f-47de-ade5-9fb7b122147c","Type":"ContainerStarted","Data":"3093a082b3bf011dd318c09a8977e18640bf92888ec6e60d7bf5861036129f35"} Apr 21 17:33:36.487425 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.485921 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" event={"ID":"14778c8a-9e3f-4e53-aea1-4de908a64e9f","Type":"ContainerStarted","Data":"7c0996be6cfc653cebd859a4650297a7d219def52f3d8216fd877871854a819f"} Apr 21 17:33:36.488496 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.488468 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ksjjv" event={"ID":"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56","Type":"ContainerStarted","Data":"9195dcb8334ce5a18581fae03f5d8b0ee4dd55a70de7c2b4526aeaa02244cb61"} Apr 21 17:33:36.491411 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.491367 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sjdgx" event={"ID":"45e8c620-ac92-4664-985c-5abe0fc26bed","Type":"ContainerStarted","Data":"b02df8ab37b42d69e80aa44f6d890656de47009aa6d98c767d981d86e9ea53f9"} Apr 21 17:33:36.493943 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.493894 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gblls" event={"ID":"b1a13fb4-794b-44ec-aaad-3da758847a9e","Type":"ContainerStarted","Data":"8ca45ea24202b101b8b08fb57386956f9ea0a15013a32dd9581c1ed6adb154c2"} Apr 21 17:33:36.503890 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.503228 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal" event={"ID":"666dcd2066b38a4dbb7941535c2eb7f9","Type":"ContainerStarted","Data":"fbb31b37f171ca8b506bd7466ce1dea02ea4ddc65127b275234ae27f5c7e0033"} Apr 21 17:33:36.507651 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:33:36.507580 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerStarted","Data":"2706b7045797bbada357d9ec81ad701741de68ab79347f026fc7a1af1b69c51a"} Apr 21 17:33:36.518619 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.518539 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgc5c" event={"ID":"edc1db03-a462-4f21-bb36-369766777418","Type":"ContainerStarted","Data":"aac9cd787c384da5a6ba1b7a73d4f0e984cd691f425a8f38edba8f0e96df95ae"} Apr 21 17:33:36.525379 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.525317 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hmlbc" event={"ID":"d5d8e485-86a1-4255-912b-af222842087c","Type":"ContainerStarted","Data":"41109a926c442f707538bc96ce415757766028fa794d7c4e4d2e7af8084664d0"} Apr 21 17:33:36.979376 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.978595 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:33:36.979376 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:36.978650 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:33:36.979376 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.978811 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:36.979376 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.978871 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs podName:38cd15ba-d0c7-4b4f-b220-f72981ccd9da nodeName:}" failed. No retries permitted until 2026-04-21 17:33:38.978852238 +0000 UTC m=+6.118914988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs") pod "network-metrics-daemon-rfmv6" (UID: "38cd15ba-d0c7-4b4f-b220-f72981ccd9da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:36.979376 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.979288 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 17:33:36.979376 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:36.979354 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret podName:b0a2f124-319a-473e-9b27-5c36c13da638 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:37.979336455 +0000 UTC m=+5.119399206 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret") pod "global-pull-secret-syncer-94ktg" (UID: "b0a2f124-319a-473e-9b27-5c36c13da638") : object "kube-system"/"original-pull-secret" not registered Apr 21 17:33:37.079921 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:37.079886 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd69d\" (UniqueName: \"kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d\") pod \"network-check-target-5bfpn\" (UID: \"fbb6a7fe-cc60-43c1-919d-78f0d38148cd\") " pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:33:37.080111 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:37.080092 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 17:33:37.080188 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:37.080118 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 17:33:37.080188 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:37.080130 2583 projected.go:194] Error preparing data for projected volume kube-api-access-jd69d for pod openshift-network-diagnostics/network-check-target-5bfpn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:37.080283 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:37.080209 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d podName:fbb6a7fe-cc60-43c1-919d-78f0d38148cd nodeName:}" failed. 
No retries permitted until 2026-04-21 17:33:39.080188852 +0000 UTC m=+6.220251616 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jd69d" (UniqueName: "kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d") pod "network-check-target-5bfpn" (UID: "fbb6a7fe-cc60-43c1-919d-78f0d38148cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:37.460035 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:37.459952 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:33:37.460662 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:37.460224 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd" Apr 21 17:33:37.545330 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:37.545290 2583 generic.go:358] "Generic (PLEG): container finished" podID="d1a7117e9f363df7d23bfcc1cce2414c" containerID="d200c289d8ac95942397bce7568cf6c3c539f2bc599a1dba4a7669503fd93cd4" exitCode=0 Apr 21 17:33:37.545885 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:37.545856 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal" event={"ID":"d1a7117e9f363df7d23bfcc1cce2414c","Type":"ContainerDied","Data":"d200c289d8ac95942397bce7568cf6c3c539f2bc599a1dba4a7669503fd93cd4"} Apr 21 17:33:37.575742 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:37.574527 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-92.ec2.internal" podStartSLOduration=3.574504877 podStartE2EDuration="3.574504877s" podCreationTimestamp="2026-04-21 17:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:33:36.528408682 +0000 UTC m=+3.668471450" watchObservedRunningTime="2026-04-21 17:33:37.574504877 +0000 UTC m=+4.714567644" Apr 21 17:33:37.988790 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:37.988749 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:33:37.988983 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:37.988954 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 
17:33:37.989071 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:37.989054 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret podName:b0a2f124-319a-473e-9b27-5c36c13da638 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:39.989033674 +0000 UTC m=+7.129096434 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret") pod "global-pull-secret-syncer-94ktg" (UID: "b0a2f124-319a-473e-9b27-5c36c13da638") : object "kube-system"/"original-pull-secret" not registered Apr 21 17:33:38.459982 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:38.459941 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:33:38.460201 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:38.460102 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da" Apr 21 17:33:38.460692 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:38.460596 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:33:38.460757 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:38.460690 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638" Apr 21 17:33:38.561932 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:38.561891 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal" event={"ID":"d1a7117e9f363df7d23bfcc1cce2414c","Type":"ContainerStarted","Data":"2b880342e1b04a4878412e8c18907651a2aab3eee43fd4866e7971aa34d98111"} Apr 21 17:33:38.999087 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:38.999048 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:33:38.999271 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:38.999244 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:38.999329 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:38.999316 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs podName:38cd15ba-d0c7-4b4f-b220-f72981ccd9da nodeName:}" failed. No retries permitted until 2026-04-21 17:33:42.999295339 +0000 UTC m=+10.139358088 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs") pod "network-metrics-daemon-rfmv6" (UID: "38cd15ba-d0c7-4b4f-b220-f72981ccd9da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:39.099553 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:39.099504 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd69d\" (UniqueName: \"kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d\") pod \"network-check-target-5bfpn\" (UID: \"fbb6a7fe-cc60-43c1-919d-78f0d38148cd\") " pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:33:39.099726 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:39.099707 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 17:33:39.099726 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:39.099724 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 17:33:39.099843 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:39.099738 2583 projected.go:194] Error preparing data for projected volume kube-api-access-jd69d for pod openshift-network-diagnostics/network-check-target-5bfpn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:39.099843 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:39.099792 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d podName:fbb6a7fe-cc60-43c1-919d-78f0d38148cd nodeName:}" failed. 
No retries permitted until 2026-04-21 17:33:43.099777027 +0000 UTC m=+10.239839776 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jd69d" (UniqueName: "kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d") pod "network-check-target-5bfpn" (UID: "fbb6a7fe-cc60-43c1-919d-78f0d38148cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:39.461982 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:39.461951 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:33:39.462452 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:39.462083 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd" Apr 21 17:33:40.008512 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:40.008474 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:33:40.008674 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:40.008658 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 17:33:40.008729 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:40.008724 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret podName:b0a2f124-319a-473e-9b27-5c36c13da638 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:44.008704112 +0000 UTC m=+11.148766861 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret") pod "global-pull-secret-syncer-94ktg" (UID: "b0a2f124-319a-473e-9b27-5c36c13da638") : object "kube-system"/"original-pull-secret" not registered Apr 21 17:33:40.461338 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:40.460579 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:33:40.461338 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:40.460628 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:33:40.461338 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:40.460710 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638" Apr 21 17:33:40.461338 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:40.461290 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da" Apr 21 17:33:41.462361 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:41.462326 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:33:41.462832 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:41.462433 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd" Apr 21 17:33:42.461243 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:42.460713 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:33:42.461243 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:42.460852 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da" Apr 21 17:33:42.461475 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:42.461324 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:33:42.461475 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:42.461425 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638" Apr 21 17:33:43.038069 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:43.038033 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:33:43.038549 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:43.038196 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:43.038549 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:43.038257 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs podName:38cd15ba-d0c7-4b4f-b220-f72981ccd9da nodeName:}" failed. No retries permitted until 2026-04-21 17:33:51.038237867 +0000 UTC m=+18.178300629 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs") pod "network-metrics-daemon-rfmv6" (UID: "38cd15ba-d0c7-4b4f-b220-f72981ccd9da") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:43.138960 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:43.138388 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd69d\" (UniqueName: \"kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d\") pod \"network-check-target-5bfpn\" (UID: \"fbb6a7fe-cc60-43c1-919d-78f0d38148cd\") " pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:33:43.138960 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:43.138546 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 17:33:43.138960 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:43.138563 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 17:33:43.138960 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:43.138576 2583 projected.go:194] Error preparing data for projected volume kube-api-access-jd69d for pod openshift-network-diagnostics/network-check-target-5bfpn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:43.138960 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:43.138633 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d podName:fbb6a7fe-cc60-43c1-919d-78f0d38148cd nodeName:}" failed. 
No retries permitted until 2026-04-21 17:33:51.138613987 +0000 UTC m=+18.278676750 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jd69d" (UniqueName: "kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d") pod "network-check-target-5bfpn" (UID: "fbb6a7fe-cc60-43c1-919d-78f0d38148cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:43.460910 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:43.460877 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:33:43.461104 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:43.461000 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd"
Apr 21 17:33:44.046026 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:44.045979 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:44.046529 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:44.046212 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 17:33:44.046529 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:44.046288 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret podName:b0a2f124-319a-473e-9b27-5c36c13da638 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:52.046263632 +0000 UTC m=+19.186326401 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret") pod "global-pull-secret-syncer-94ktg" (UID: "b0a2f124-319a-473e-9b27-5c36c13da638") : object "kube-system"/"original-pull-secret" not registered
Apr 21 17:33:44.460347 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:44.460314 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:44.460514 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:44.460458 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:33:44.460901 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:44.460851 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:44.460990 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:44.460950 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:33:45.460102 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:45.460067 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:45.460501 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:45.460202 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd"
Apr 21 17:33:46.460380 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:46.460341 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:46.460835 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:46.460347 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:46.460835 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:46.460465 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:33:46.460835 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:46.460573 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:33:47.460855 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:47.460815 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:47.461329 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:47.460960 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd"
Apr 21 17:33:48.460805 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:48.460763 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:48.460998 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:48.460888 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:33:48.460998 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:48.460919 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:48.460998 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:48.460988 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:33:49.460516 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:49.460480 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:49.460674 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:49.460602 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd"
Apr 21 17:33:50.460798 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:50.460759 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:50.461236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:50.460760 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:50.461236 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:50.460903 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:33:50.461236 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:50.460944 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:33:51.098738 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:51.098698 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:51.098908 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:51.098880 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 17:33:51.098983 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:51.098963 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs podName:38cd15ba-d0c7-4b4f-b220-f72981ccd9da nodeName:}" failed. No retries permitted until 2026-04-21 17:34:07.098943429 +0000 UTC m=+34.239006179 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs") pod "network-metrics-daemon-rfmv6" (UID: "38cd15ba-d0c7-4b4f-b220-f72981ccd9da") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 17:33:51.199917 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:51.199875 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd69d\" (UniqueName: \"kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d\") pod \"network-check-target-5bfpn\" (UID: \"fbb6a7fe-cc60-43c1-919d-78f0d38148cd\") " pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:51.200103 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:51.200071 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 17:33:51.200103 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:51.200095 2583 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 17:33:51.200211 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:51.200107 2583 projected.go:194] Error preparing data for projected volume kube-api-access-jd69d for pod openshift-network-diagnostics/network-check-target-5bfpn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 17:33:51.200211 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:51.200187 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d podName:fbb6a7fe-cc60-43c1-919d-78f0d38148cd nodeName:}" failed. No retries permitted until 2026-04-21 17:34:07.200153946 +0000 UTC m=+34.340216694 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jd69d" (UniqueName: "kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d") pod "network-check-target-5bfpn" (UID: "fbb6a7fe-cc60-43c1-919d-78f0d38148cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 17:33:51.460232 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:51.460193 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:51.460406 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:51.460346 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd"
Apr 21 17:33:52.106576 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:52.106520 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:52.107029 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:52.106688 2583 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 17:33:52.107029 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:52.106777 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret podName:b0a2f124-319a-473e-9b27-5c36c13da638 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:08.106755924 +0000 UTC m=+35.246818672 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret") pod "global-pull-secret-syncer-94ktg" (UID: "b0a2f124-319a-473e-9b27-5c36c13da638") : object "kube-system"/"original-pull-secret" not registered
Apr 21 17:33:52.460700 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:52.460661 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:52.460892 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:52.460780 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:33:52.460892 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:52.460846 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:52.461013 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:52.460972 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:33:53.461841 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.461797 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:53.462253 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:53.461945 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd"
Apr 21 17:33:53.588821 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.588555 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerStarted","Data":"14b03af3b5a64a7feddb519b2223481689c48e1846f4cdbae2dffe30abc2c36a"}
Apr 21 17:33:53.589899 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.589870 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sgc5c" event={"ID":"edc1db03-a462-4f21-bb36-369766777418","Type":"ContainerStarted","Data":"8b9ee90ae869a8d08c316ca5e1369c970c3d4f87fe51462e67e35cd8f20fb4c9"}
Apr 21 17:33:53.591253 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.591232 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hmlbc" event={"ID":"d5d8e485-86a1-4255-912b-af222842087c","Type":"ContainerStarted","Data":"a0c7dc2758e3d378621a67b5cb15f383ca9d4c5b187a685dea2e7ed888bb5b09"}
Apr 21 17:33:53.592755 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.592729 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" event={"ID":"aaab4344-56a1-42b9-9a96-5071b6e23282","Type":"ContainerStarted","Data":"2bac3576af1219a0a14b39c5ed36fcb7494c638a039c33558cfea52aae1c40d4"}
Apr 21 17:33:53.594077 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.594052 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hccth" event={"ID":"16529597-1f2f-47de-ade5-9fb7b122147c","Type":"ContainerStarted","Data":"e4522295c12324bb6a05b82b1a289d604bfca146ad58e36aae3727f5c481d651"}
Apr 21 17:33:53.596001 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.595968 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" event={"ID":"14778c8a-9e3f-4e53-aea1-4de908a64e9f","Type":"ContainerStarted","Data":"6d7f92254883c345cee58c5433cfc045a0c06e454c7b04678883d7df5e228eb7"}
Apr 21 17:33:53.596001 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.595999 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" event={"ID":"14778c8a-9e3f-4e53-aea1-4de908a64e9f","Type":"ContainerStarted","Data":"228ccdb9212348329b5e9898f97874338f1a4f51763e4d34889ada844370d3eb"}
Apr 21 17:33:53.596158 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.596014 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" event={"ID":"14778c8a-9e3f-4e53-aea1-4de908a64e9f","Type":"ContainerStarted","Data":"fcc896274f05331eb2fe2ec1b0803037e536e77cad655b4240001640cbc66de7"}
Apr 21 17:33:53.597136 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.597117 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-sjdgx" event={"ID":"45e8c620-ac92-4664-985c-5abe0fc26bed","Type":"ContainerStarted","Data":"b884b5b80b90b46df5000f9c40997a43b3ba7ccc65ecd8c8e53623904614fea1"}
Apr 21 17:33:53.598247 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.598224 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gblls" event={"ID":"b1a13fb4-794b-44ec-aaad-3da758847a9e","Type":"ContainerStarted","Data":"cd03010cc233e487adb97f45c4f4b8a56a72d964b4ab91fd07bdec4f43b86f28"}
Apr 21 17:33:53.612958 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.612912 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-92.ec2.internal" podStartSLOduration=19.612900152 podStartE2EDuration="19.612900152s" podCreationTimestamp="2026-04-21 17:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:33:38.586390266 +0000 UTC m=+5.726453035" watchObservedRunningTime="2026-04-21 17:33:53.612900152 +0000 UTC m=+20.752962918"
Apr 21 17:33:53.627717 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.627624 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hmlbc" podStartSLOduration=3.572621405 podStartE2EDuration="20.627604933s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:33:36.058068684 +0000 UTC m=+3.198131433" lastFinishedPulling="2026-04-21 17:33:53.113052199 +0000 UTC m=+20.253114961" observedRunningTime="2026-04-21 17:33:53.627325431 +0000 UTC m=+20.767388199" watchObservedRunningTime="2026-04-21 17:33:53.627604933 +0000 UTC m=+20.767667698"
Apr 21 17:33:53.647893 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.647843 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gblls" podStartSLOduration=3.549198003 podStartE2EDuration="20.647829454s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:33:36.051342365 +0000 UTC m=+3.191405110" lastFinishedPulling="2026-04-21 17:33:53.149973809 +0000 UTC m=+20.290036561" observedRunningTime="2026-04-21 17:33:53.647050208 +0000 UTC m=+20.787112976" watchObservedRunningTime="2026-04-21 17:33:53.647829454 +0000 UTC m=+20.787892221"
Apr 21 17:33:53.662108 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.662071 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hccth" podStartSLOduration=2.611253848 podStartE2EDuration="19.662058138s" podCreationTimestamp="2026-04-21 17:33:34 +0000 UTC" firstStartedPulling="2026-04-21 17:33:36.062245543 +0000 UTC m=+3.202308288" lastFinishedPulling="2026-04-21 17:33:53.113049819 +0000 UTC m=+20.253112578" observedRunningTime="2026-04-21 17:33:53.661673733 +0000 UTC m=+20.801736501" watchObservedRunningTime="2026-04-21 17:33:53.662058138 +0000 UTC m=+20.802120904"
Apr 21 17:33:53.677567 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.677514 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sgc5c" podStartSLOduration=3.543175072 podStartE2EDuration="20.677499543s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:33:36.060167482 +0000 UTC m=+3.200230227" lastFinishedPulling="2026-04-21 17:33:53.19449194 +0000 UTC m=+20.334554698" observedRunningTime="2026-04-21 17:33:53.677123289 +0000 UTC m=+20.817186057" watchObservedRunningTime="2026-04-21 17:33:53.677499543 +0000 UTC m=+20.817562310"
Apr 21 17:33:53.694194 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:53.694130 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-sjdgx" podStartSLOduration=3.596243261 podStartE2EDuration="20.694116434s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:33:36.052099392 +0000 UTC m=+3.192162136" lastFinishedPulling="2026-04-21 17:33:53.149972562 +0000 UTC m=+20.290035309" observedRunningTime="2026-04-21 17:33:53.693689255 +0000 UTC m=+20.833752026" watchObservedRunningTime="2026-04-21 17:33:53.694116434 +0000 UTC m=+20.834179200"
Apr 21 17:33:54.305844 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.305815 2583 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 17:33:54.428773 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.428689 2583 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T17:33:54.305836664Z","UUID":"3d596ffa-72ab-4f64-96a0-65f3f069194d","Handler":null,"Name":"","Endpoint":""}
Apr 21 17:33:54.430311 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.430291 2583 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 17:33:54.430406 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.430318 2583 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 17:33:54.460017 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.459984 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:54.460196 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.459984 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:54.460196 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:54.460085 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:33:54.460293 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:54.460200 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:33:54.602401 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.602360 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" event={"ID":"14778c8a-9e3f-4e53-aea1-4de908a64e9f","Type":"ContainerStarted","Data":"4a21053240637eee8475fb0e66761d374bd220a5fc39467d36e8a15f82842a0f"}
Apr 21 17:33:54.603049 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.602405 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" event={"ID":"14778c8a-9e3f-4e53-aea1-4de908a64e9f","Type":"ContainerStarted","Data":"890139cc50c5f4b3e71f734d69e4808802e714afbd7de1e8b200ac35d11189d2"}
Apr 21 17:33:54.603049 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.602422 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" event={"ID":"14778c8a-9e3f-4e53-aea1-4de908a64e9f","Type":"ContainerStarted","Data":"8bcad5532a2134ec16f9226ac18ccdd4a9ff93c9f56965d963e5eec0c8914b7f"}
Apr 21 17:33:54.603634 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.603612 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ksjjv" event={"ID":"eb995f1e-e69f-4fb1-b42d-0bc3ebbc7b56","Type":"ContainerStarted","Data":"8f90c09f04fdafc305bb4aabf4d86d828cacddc2dc0b8e9eb4b261e7fbd7ccb8"}
Apr 21 17:33:54.604789 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.604765 2583 generic.go:358] "Generic (PLEG): container finished" podID="47035621-4957-4280-94ce-ecd6810f7254" containerID="14b03af3b5a64a7feddb519b2223481689c48e1846f4cdbae2dffe30abc2c36a" exitCode=0
Apr 21 17:33:54.604890 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.604821 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerDied","Data":"14b03af3b5a64a7feddb519b2223481689c48e1846f4cdbae2dffe30abc2c36a"}
Apr 21 17:33:54.606418 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.606392 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" event={"ID":"aaab4344-56a1-42b9-9a96-5071b6e23282","Type":"ContainerStarted","Data":"e2b3e830d65c8d98d32c6c4462148ddbb65c8de64026cf08167b0c61531cc407"}
Apr 21 17:33:54.621106 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:54.621065 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-ksjjv" podStartSLOduration=4.523814458 podStartE2EDuration="21.621052017s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:33:36.053066605 +0000 UTC m=+3.193129349" lastFinishedPulling="2026-04-21 17:33:53.150304157 +0000 UTC m=+20.290366908" observedRunningTime="2026-04-21 17:33:54.620955259 +0000 UTC m=+21.761018037" watchObservedRunningTime="2026-04-21 17:33:54.621052017 +0000 UTC m=+21.761114784"
Apr 21 17:33:55.460863 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:55.460834 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:55.461058 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:55.460942 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd"
Apr 21 17:33:55.610082 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:55.610044 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" event={"ID":"aaab4344-56a1-42b9-9a96-5071b6e23282","Type":"ContainerStarted","Data":"7edf6bde692ef31ae6b16a50ac030fc57b4231a605ff2e9d093a3a42d506c00c"}
Apr 21 17:33:55.638279 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:55.638234 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-msn6s" podStartSLOduration=3.634668943 podStartE2EDuration="22.638219857s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:33:36.049032289 +0000 UTC m=+3.189095051" lastFinishedPulling="2026-04-21 17:33:55.052583212 +0000 UTC m=+22.192645965" observedRunningTime="2026-04-21 17:33:55.63795584 +0000 UTC m=+22.778018619" watchObservedRunningTime="2026-04-21 17:33:55.638219857 +0000 UTC m=+22.778282629"
Apr 21 17:33:56.460868 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:56.460671 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:56.461040 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:56.460694 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:56.461040 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:56.460964 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:33:56.461040 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:56.461001 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:33:56.615007 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:56.614970 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" event={"ID":"14778c8a-9e3f-4e53-aea1-4de908a64e9f","Type":"ContainerStarted","Data":"d5ca8271e519c35f0daec714dcee5337b8b6cf07786d444a158da55761e87393"}
Apr 21 17:33:56.815433 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:56.815348 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-sjdgx"
Apr 21 17:33:56.816032 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:56.816005 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-sjdgx"
Apr 21 17:33:57.460645 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:57.460612 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:57.460853 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:57.460747 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd"
Apr 21 17:33:57.617355 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:57.617320 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-sjdgx"
Apr 21 17:33:57.617897 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:57.617871 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-sjdgx"
Apr 21 17:33:58.460025 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:58.459989 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:33:58.460236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:58.459989 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:33:58.460236 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:58.460155 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:33:58.460236 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:58.460218 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:33:59.460052 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:59.460018 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:33:59.460403 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:33:59.460127 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd"
Apr 21 17:33:59.624452 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:59.624410 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" event={"ID":"14778c8a-9e3f-4e53-aea1-4de908a64e9f","Type":"ContainerStarted","Data":"b29ed7bf6142c3e8c05bf1f42ce77cc72107c3b08257cb70f754249c0bdb5a8e"}
Apr 21 17:33:59.624749 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:59.624729 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:59.624847 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:59.624759 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:59.626229 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:59.626197 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerStarted","Data":"a83601f0c359f48e7b0684d7e4db1e17fddaaf863a193613c595eaeedd628f99"}
Apr 21 17:33:59.640440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:59.640404 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:33:59.654785 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:33:59.654728 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" podStartSLOduration=9.447458557 podStartE2EDuration="26.654713611s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:33:36.056091384 +0000 UTC m=+3.196154143" lastFinishedPulling="2026-04-21 17:33:53.263346438 +0000 UTC m=+20.403409197" observedRunningTime="2026-04-21 17:33:59.654189679 +0000 UTC m=+26.794252437" watchObservedRunningTime="2026-04-21 17:33:59.654713611 +0000 UTC m=+26.794776377"
Apr 21 17:34:00.460278 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:00.460250 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:34:00.460674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:00.460258 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:34:00.460674 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:00.460355 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638"
Apr 21 17:34:00.460674 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:00.460460 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da" Apr 21 17:34:00.629225 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:00.629192 2583 generic.go:358] "Generic (PLEG): container finished" podID="47035621-4957-4280-94ce-ecd6810f7254" containerID="a83601f0c359f48e7b0684d7e4db1e17fddaaf863a193613c595eaeedd628f99" exitCode=0 Apr 21 17:34:00.629439 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:00.629280 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerDied","Data":"a83601f0c359f48e7b0684d7e4db1e17fddaaf863a193613c595eaeedd628f99"} Apr 21 17:34:00.629759 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:00.629733 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:34:00.644780 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:00.644756 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5" Apr 21 17:34:01.149763 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:01.149730 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-94ktg"] Apr 21 17:34:01.149942 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:01.149875 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:34:01.150007 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:01.149976 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638" Apr 21 17:34:01.152864 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:01.152837 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rfmv6"] Apr 21 17:34:01.153049 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:01.152876 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5bfpn"] Apr 21 17:34:01.153049 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:01.152965 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:34:01.153145 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:01.153058 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd" Apr 21 17:34:01.153145 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:01.153085 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:34:01.153239 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:01.153211 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da" Apr 21 17:34:02.459866 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:02.459838 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:34:02.460614 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:02.459845 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:34:02.460614 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:02.459974 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:34:02.460614 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:02.459940 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638" Apr 21 17:34:02.460614 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:02.460071 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da" Apr 21 17:34:02.460614 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:02.460118 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd" Apr 21 17:34:02.635217 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:02.634887 2583 generic.go:358] "Generic (PLEG): container finished" podID="47035621-4957-4280-94ce-ecd6810f7254" containerID="5a5028da7368dfd2732b5db9c02739989a1b7c7e1306647bc3e1599c92e23874" exitCode=0 Apr 21 17:34:02.635362 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:02.634961 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerDied","Data":"5a5028da7368dfd2732b5db9c02739989a1b7c7e1306647bc3e1599c92e23874"} Apr 21 17:34:04.460688 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:04.460644 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:34:04.461136 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:04.460739 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5bfpn" podUID="fbb6a7fe-cc60-43c1-919d-78f0d38148cd" Apr 21 17:34:04.461136 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:04.460765 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:34:04.461136 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:04.460788 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:34:04.461136 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:04.460858 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-94ktg" podUID="b0a2f124-319a-473e-9b27-5c36c13da638" Apr 21 17:34:04.461136 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:04.460924 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da" Apr 21 17:34:04.641730 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:04.641700 2583 generic.go:358] "Generic (PLEG): container finished" podID="47035621-4957-4280-94ce-ecd6810f7254" containerID="a7f12dbf263abe094e76d9d344f92ce5aaa56a59b9c0a4b27129c1632bb31d22" exitCode=0 Apr 21 17:34:04.641899 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:04.641756 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerDied","Data":"a7f12dbf263abe094e76d9d344f92ce5aaa56a59b9c0a4b27129c1632bb31d22"} Apr 21 17:34:05.190070 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.189994 2583 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-92.ec2.internal" event="NodeReady" Apr 21 17:34:05.190243 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.190152 2583 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 17:34:05.250754 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.250715 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4cg4j"] Apr 21 17:34:05.271270 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.271242 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tk5wc"] Apr 21 17:34:05.271453 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.271417 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4cg4j" Apr 21 17:34:05.273957 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.273931 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 17:34:05.274649 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.274628 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 17:34:05.274907 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.274890 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4fbh6\"" Apr 21 17:34:05.275111 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.275096 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 17:34:05.286280 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.286258 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4cg4j"] Apr 21 17:34:05.286391 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.286292 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tk5wc"] Apr 21 17:34:05.286448 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.286424 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.289432 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.289085 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 17:34:05.289432 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.289265 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 17:34:05.289432 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.289331 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mlvk8\"" Apr 21 17:34:05.401215 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.401165 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvxnr\" (UniqueName: \"kubernetes.io/projected/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-kube-api-access-bvxnr\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.401387 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.401227 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-tmp-dir\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.401387 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.401289 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.401387 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.401314 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-config-volume\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.401387 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.401358 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j" Apr 21 17:34:05.401387 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.401380 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkbhx\" (UniqueName: \"kubernetes.io/projected/53e51873-1fc7-47fe-b3f9-224f9c0521d4-kube-api-access-nkbhx\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j" Apr 21 17:34:05.501980 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.501932 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.501980 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.501976 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-config-volume\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.502497 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.502029 2583 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j" Apr 21 17:34:05.502497 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.502053 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkbhx\" (UniqueName: \"kubernetes.io/projected/53e51873-1fc7-47fe-b3f9-224f9c0521d4-kube-api-access-nkbhx\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j" Apr 21 17:34:05.502497 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:05.502095 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:34:05.502497 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.502117 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvxnr\" (UniqueName: \"kubernetes.io/projected/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-kube-api-access-bvxnr\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.502497 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.502145 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-tmp-dir\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.502497 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:05.502163 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls podName:c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124 nodeName:}" failed. 
No retries permitted until 2026-04-21 17:34:06.00214281 +0000 UTC m=+33.142205577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls") pod "dns-default-tk5wc" (UID: "c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124") : secret "dns-default-metrics-tls" not found Apr 21 17:34:05.502497 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.502464 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-tmp-dir\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.502800 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:05.502567 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:05.502800 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:05.502612 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert podName:53e51873-1fc7-47fe-b3f9-224f9c0521d4 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:06.002597314 +0000 UTC m=+33.142660061 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert") pod "ingress-canary-4cg4j" (UID: "53e51873-1fc7-47fe-b3f9-224f9c0521d4") : secret "canary-serving-cert" not found Apr 21 17:34:05.503126 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.503102 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-config-volume\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.526120 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.526091 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvxnr\" (UniqueName: \"kubernetes.io/projected/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-kube-api-access-bvxnr\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:05.526310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:05.526251 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkbhx\" (UniqueName: \"kubernetes.io/projected/53e51873-1fc7-47fe-b3f9-224f9c0521d4-kube-api-access-nkbhx\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j" Apr 21 17:34:06.005307 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.005259 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j" Apr 21 17:34:06.005501 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.005380 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:06.005501 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:06.005420 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:06.005501 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:06.005489 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:34:06.005501 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:06.005496 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert podName:53e51873-1fc7-47fe-b3f9-224f9c0521d4 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:07.005477731 +0000 UTC m=+34.145540485 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert") pod "ingress-canary-4cg4j" (UID: "53e51873-1fc7-47fe-b3f9-224f9c0521d4") : secret "canary-serving-cert" not found Apr 21 17:34:06.005726 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:06.005531 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls podName:c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:07.005519997 +0000 UTC m=+34.145582742 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls") pod "dns-default-tk5wc" (UID: "c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124") : secret "dns-default-metrics-tls" not found Apr 21 17:34:06.460206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.460153 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:34:06.460391 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.460153 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn" Apr 21 17:34:06.460391 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.460161 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg" Apr 21 17:34:06.463268 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.463242 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 17:34:06.464402 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.464253 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 17:34:06.464402 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.464266 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqc26\"" Apr 21 17:34:06.464402 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.464286 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 17:34:06.464402 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.464307 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5lbmh\"" Apr 21 
17:34:06.464402 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:06.464286 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 17:34:07.013036 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:07.012996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc" Apr 21 17:34:07.013587 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:07.013096 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j" Apr 21 17:34:07.013587 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:07.013249 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:07.013587 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:07.013337 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert podName:53e51873-1fc7-47fe-b3f9-224f9c0521d4 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:09.013316615 +0000 UTC m=+36.153379365 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert") pod "ingress-canary-4cg4j" (UID: "53e51873-1fc7-47fe-b3f9-224f9c0521d4") : secret "canary-serving-cert" not found
Apr 21 17:34:07.013761 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:07.013735 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 17:34:07.013816 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:07.013772 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls podName:c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:09.013761928 +0000 UTC m=+36.153824674 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls") pod "dns-default-tk5wc" (UID: "c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124") : secret "dns-default-metrics-tls" not found
Apr 21 17:34:07.113789 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:07.113750 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:34:07.113996 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:07.113938 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 17:34:07.114051 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:07.114043 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs podName:38cd15ba-d0c7-4b4f-b220-f72981ccd9da nodeName:}" failed. No retries permitted until 2026-04-21 17:34:39.114020789 +0000 UTC m=+66.254083538 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs") pod "network-metrics-daemon-rfmv6" (UID: "38cd15ba-d0c7-4b4f-b220-f72981ccd9da") : secret "metrics-daemon-secret" not found
Apr 21 17:34:07.214334 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:07.214292 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd69d\" (UniqueName: \"kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d\") pod \"network-check-target-5bfpn\" (UID: \"fbb6a7fe-cc60-43c1-919d-78f0d38148cd\") " pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:34:07.217466 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:07.217435 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd69d\" (UniqueName: \"kubernetes.io/projected/fbb6a7fe-cc60-43c1-919d-78f0d38148cd-kube-api-access-jd69d\") pod \"network-check-target-5bfpn\" (UID: \"fbb6a7fe-cc60-43c1-919d-78f0d38148cd\") " pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:34:07.379079 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:07.379000 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:34:07.554875 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:07.554837 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5bfpn"]
Apr 21 17:34:07.566462 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:34:07.566424 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbb6a7fe_cc60_43c1_919d_78f0d38148cd.slice/crio-b91b3a3dede586d9f1be65082695649ce4a72ffcc7ddd93019249e0de17a3b6a WatchSource:0}: Error finding container b91b3a3dede586d9f1be65082695649ce4a72ffcc7ddd93019249e0de17a3b6a: Status 404 returned error can't find the container with id b91b3a3dede586d9f1be65082695649ce4a72ffcc7ddd93019249e0de17a3b6a
Apr 21 17:34:07.648815 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:07.648726 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5bfpn" event={"ID":"fbb6a7fe-cc60-43c1-919d-78f0d38148cd","Type":"ContainerStarted","Data":"b91b3a3dede586d9f1be65082695649ce4a72ffcc7ddd93019249e0de17a3b6a"}
Apr 21 17:34:08.122029 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:08.121991 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:34:08.126514 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:08.126486 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b0a2f124-319a-473e-9b27-5c36c13da638-original-pull-secret\") pod \"global-pull-secret-syncer-94ktg\" (UID: \"b0a2f124-319a-473e-9b27-5c36c13da638\") " pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:34:08.285600 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:08.285343 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-94ktg"
Apr 21 17:34:09.028708 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:09.028655 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:34:09.028916 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:09.028731 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j"
Apr 21 17:34:09.028916 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:09.028837 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 17:34:09.028916 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:09.028853 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 17:34:09.028916 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:09.028902 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls podName:c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:13.028886414 +0000 UTC m=+40.168949162 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls") pod "dns-default-tk5wc" (UID: "c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124") : secret "dns-default-metrics-tls" not found
Apr 21 17:34:09.028916 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:09.028916 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert podName:53e51873-1fc7-47fe-b3f9-224f9c0521d4 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:13.028910534 +0000 UTC m=+40.168973279 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert") pod "ingress-canary-4cg4j" (UID: "53e51873-1fc7-47fe-b3f9-224f9c0521d4") : secret "canary-serving-cert" not found
Apr 21 17:34:11.868439 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:11.868411 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-94ktg"]
Apr 21 17:34:11.873241 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:34:11.873213 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a2f124_319a_473e_9b27_5c36c13da638.slice/crio-af657bb0588f9ddf20ac5ece0cc9dabaa9fa539326e23aae4bdb04b7e7c721aa WatchSource:0}: Error finding container af657bb0588f9ddf20ac5ece0cc9dabaa9fa539326e23aae4bdb04b7e7c721aa: Status 404 returned error can't find the container with id af657bb0588f9ddf20ac5ece0cc9dabaa9fa539326e23aae4bdb04b7e7c721aa
Apr 21 17:34:12.660041 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:12.660003 2583 generic.go:358] "Generic (PLEG): container finished" podID="47035621-4957-4280-94ce-ecd6810f7254" containerID="7a17e8e30b4da43c22164b3e03e7991011a567b06411de1512f63ae04d0b5d68" exitCode=0
Apr 21 17:34:12.660235 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:12.660072 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerDied","Data":"7a17e8e30b4da43c22164b3e03e7991011a567b06411de1512f63ae04d0b5d68"}
Apr 21 17:34:12.661493 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:12.661468 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5bfpn" event={"ID":"fbb6a7fe-cc60-43c1-919d-78f0d38148cd","Type":"ContainerStarted","Data":"c2d11e057dbdfd1f882ce41bd7bd7b5d7146284f4aaa10097a3f87023d0324a2"}
Apr 21 17:34:12.661641 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:12.661626 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:34:12.662442 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:12.662420 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-94ktg" event={"ID":"b0a2f124-319a-473e-9b27-5c36c13da638","Type":"ContainerStarted","Data":"af657bb0588f9ddf20ac5ece0cc9dabaa9fa539326e23aae4bdb04b7e7c721aa"}
Apr 21 17:34:12.699938 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:12.699880 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5bfpn" podStartSLOduration=35.518855985 podStartE2EDuration="39.699863183s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:34:07.568503887 +0000 UTC m=+34.708566647" lastFinishedPulling="2026-04-21 17:34:11.749511078 +0000 UTC m=+38.889573845" observedRunningTime="2026-04-21 17:34:12.699054972 +0000 UTC m=+39.839117732" watchObservedRunningTime="2026-04-21 17:34:12.699863183 +0000 UTC m=+39.839925940"
Apr 21 17:34:13.061573 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:13.061529 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:34:13.062077 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:13.061600 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j"
Apr 21 17:34:13.062077 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:13.061701 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 17:34:13.062077 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:13.061716 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 17:34:13.062077 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:13.061780 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert podName:53e51873-1fc7-47fe-b3f9-224f9c0521d4 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:21.061758729 +0000 UTC m=+48.201821479 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert") pod "ingress-canary-4cg4j" (UID: "53e51873-1fc7-47fe-b3f9-224f9c0521d4") : secret "canary-serving-cert" not found
Apr 21 17:34:13.062340 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:13.062109 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls podName:c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:21.061789102 +0000 UTC m=+48.201851854 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls") pod "dns-default-tk5wc" (UID: "c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124") : secret "dns-default-metrics-tls" not found
Apr 21 17:34:13.667926 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:13.667888 2583 generic.go:358] "Generic (PLEG): container finished" podID="47035621-4957-4280-94ce-ecd6810f7254" containerID="923260f5a0d9461c78d6d5fe7e761deed35ebffee1d7d94bbae5e8a9130d8cb0" exitCode=0
Apr 21 17:34:13.668134 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:13.667984 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerDied","Data":"923260f5a0d9461c78d6d5fe7e761deed35ebffee1d7d94bbae5e8a9130d8cb0"}
Apr 21 17:34:14.673361 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:14.673315 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4qt27" event={"ID":"47035621-4957-4280-94ce-ecd6810f7254","Type":"ContainerStarted","Data":"3f63e63b686a2c194cd0ee1e8376c24767a20a81c797e2f6fdbea22048e28494"}
Apr 21 17:34:14.699117 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:14.699050 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4qt27" podStartSLOduration=6.0222529399999996 podStartE2EDuration="41.699033251s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:33:36.06424795 +0000 UTC m=+3.204310709" lastFinishedPulling="2026-04-21 17:34:11.741028258 +0000 UTC m=+38.881091020" observedRunningTime="2026-04-21 17:34:14.69712815 +0000 UTC m=+41.837190930" watchObservedRunningTime="2026-04-21 17:34:14.699033251 +0000 UTC m=+41.839096018"
Apr 21 17:34:16.679193 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:16.679131 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-94ktg" event={"ID":"b0a2f124-319a-473e-9b27-5c36c13da638","Type":"ContainerStarted","Data":"abb82523642b00f604e185ead5ff7a0b7c087368c01d47f802285e467db65cff"}
Apr 21 17:34:16.694099 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:16.694003 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-94ktg" podStartSLOduration=36.13771777 podStartE2EDuration="40.693987591s" podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:34:11.875453156 +0000 UTC m=+39.015515912" lastFinishedPulling="2026-04-21 17:34:16.431722986 +0000 UTC m=+43.571785733" observedRunningTime="2026-04-21 17:34:16.693792229 +0000 UTC m=+43.833854996" watchObservedRunningTime="2026-04-21 17:34:16.693987591 +0000 UTC m=+43.834050358"
Apr 21 17:34:21.117582 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:21.117547 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:34:21.117995 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:21.117592 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j"
Apr 21 17:34:21.117995 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:21.117685 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 17:34:21.117995 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:21.117692 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 17:34:21.117995 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:21.117746 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert podName:53e51873-1fc7-47fe-b3f9-224f9c0521d4 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:37.117732774 +0000 UTC m=+64.257795520 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert") pod "ingress-canary-4cg4j" (UID: "53e51873-1fc7-47fe-b3f9-224f9c0521d4") : secret "canary-serving-cert" not found
Apr 21 17:34:21.117995 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:21.117759 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls podName:c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:37.117753064 +0000 UTC m=+64.257815809 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls") pod "dns-default-tk5wc" (UID: "c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124") : secret "dns-default-metrics-tls" not found
Apr 21 17:34:32.651206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:32.651149 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xfgp5"
Apr 21 17:34:37.216337 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:37.216292 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:34:37.216337 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:37.216343 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j"
Apr 21 17:34:37.216801 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:37.216431 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 17:34:37.216801 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:37.216434 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 17:34:37.216801 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:37.216484 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert podName:53e51873-1fc7-47fe-b3f9-224f9c0521d4 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:09.216470873 +0000 UTC m=+96.356533619 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert") pod "ingress-canary-4cg4j" (UID: "53e51873-1fc7-47fe-b3f9-224f9c0521d4") : secret "canary-serving-cert" not found
Apr 21 17:34:37.216801 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:37.216495 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls podName:c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:09.216489982 +0000 UTC m=+96.356552726 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls") pod "dns-default-tk5wc" (UID: "c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124") : secret "dns-default-metrics-tls" not found
Apr 21 17:34:39.125459 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:39.125402 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:34:39.125860 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:39.125550 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 17:34:39.125860 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:34:39.125617 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs podName:38cd15ba-d0c7-4b4f-b220-f72981ccd9da nodeName:}" failed. No retries permitted until 2026-04-21 17:35:43.125600641 +0000 UTC m=+130.265663390 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs") pod "network-metrics-daemon-rfmv6" (UID: "38cd15ba-d0c7-4b4f-b220-f72981ccd9da") : secret "metrics-daemon-secret" not found
Apr 21 17:34:43.670330 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:34:43.670296 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5bfpn"
Apr 21 17:35:09.237580 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:09.237516 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:35:09.237580 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:09.237587 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j"
Apr 21 17:35:09.238130 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:09.237678 2583 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 17:35:09.238130 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:09.237694 2583 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 17:35:09.238130 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:09.237744 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls podName:c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124 nodeName:}" failed. No retries permitted until 2026-04-21 17:36:13.237729372 +0000 UTC m=+160.377792117 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls") pod "dns-default-tk5wc" (UID: "c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124") : secret "dns-default-metrics-tls" not found
Apr 21 17:35:09.238130 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:09.237758 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert podName:53e51873-1fc7-47fe-b3f9-224f9c0521d4 nodeName:}" failed. No retries permitted until 2026-04-21 17:36:13.237752256 +0000 UTC m=+160.377815000 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert") pod "ingress-canary-4cg4j" (UID: "53e51873-1fc7-47fe-b3f9-224f9c0521d4") : secret "canary-serving-cert" not found
Apr 21 17:35:40.006704 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.006669 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-bcf746f86-2s4sn"]
Apr 21 17:35:40.009625 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.009608 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.012296 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.012262 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 21 17:35:40.012442 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.012373 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 21 17:35:40.012442 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.012403 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 21 17:35:40.012575 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.012501 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-99xzp\""
Apr 21 17:35:40.012627 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.012601 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 21 17:35:40.012719 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.012660 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 21 17:35:40.012719 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.012690 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 21 17:35:40.018778 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.018758 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-bcf746f86-2s4sn"]
Apr 21 17:35:40.050392 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.050357 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-default-certificate\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.050573 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.050425 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqfgn\" (UniqueName: \"kubernetes.io/projected/3a283e8b-7dfc-4c49-9afd-4adb1c192587-kube-api-access-bqfgn\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.050573 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.050464 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.050573 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.050485 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.050679 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.050566 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-stats-auth\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.151839 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.151797 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.151839 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.151838 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.152042 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:40.151981 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:40.651961822 +0000 UTC m=+127.792024566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : configmap references non-existent config key: service-ca.crt
Apr 21 17:35:40.152042 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:40.152009 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 17:35:40.152132 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:40.152066 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:40.65204889 +0000 UTC m=+127.792111644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : secret "router-metrics-certs-default" not found
Apr 21 17:35:40.152132 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.152106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-stats-auth\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.152275 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.152161 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-default-certificate\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.152275 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.152214 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqfgn\" (UniqueName: \"kubernetes.io/projected/3a283e8b-7dfc-4c49-9afd-4adb1c192587-kube-api-access-bqfgn\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.154962 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.154936 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-stats-auth\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.155087 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.154983 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-default-certificate\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.162062 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.162036 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqfgn\" (UniqueName: \"kubernetes.io/projected/3a283e8b-7dfc-4c49-9afd-4adb1c192587-kube-api-access-bqfgn\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.656909 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.656874 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.656909 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:40.656911 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:40.657470 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:40.657421 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 17:35:40.660126 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:40.657390 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:41.657035471 +0000 UTC m=+128.797098227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : configmap references non-existent config key: service-ca.crt
Apr 21 17:35:40.660126 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:40.658078 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:41.658046289 +0000 UTC m=+128.798109055 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : secret "router-metrics-certs-default" not found
Apr 21 17:35:41.664425 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:41.664380 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:41.664425 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:41.664431 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:35:41.664866 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:41.664521 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 21 17:35:41.664866 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:41.664579 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:43.664564165 +0000 UTC m=+130.804626910 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : secret "router-metrics-certs-default" not found
Apr 21 17:35:41.664866 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:41.664621 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:43.664615493 +0000 UTC m=+130.804678238 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : configmap references non-existent config key: service-ca.crt
Apr 21 17:35:43.121072 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.121033 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps"]
Apr 21 17:35:43.124221 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.124198 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:43.127112 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.127088 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-ks2jk\"" Apr 21 17:35:43.127265 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.127092 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 17:35:43.128080 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.128063 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:35:43.128147 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.128109 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 17:35:43.130545 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.130526 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps"] Apr 21 17:35:43.176141 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.176093 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:35:43.176141 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.176147 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:43.176364 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.176188 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz9wv\" (UniqueName: \"kubernetes.io/projected/8ae4e677-ac3a-478a-b4bd-4472d232259b-kube-api-access-dz9wv\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:43.176364 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:43.176265 2583 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 17:35:43.176364 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:43.176349 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs podName:38cd15ba-d0c7-4b4f-b220-f72981ccd9da nodeName:}" failed. No retries permitted until 2026-04-21 17:37:45.176332464 +0000 UTC m=+252.316395210 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs") pod "network-metrics-daemon-rfmv6" (UID: "38cd15ba-d0c7-4b4f-b220-f72981ccd9da") : secret "metrics-daemon-secret" not found Apr 21 17:35:43.277453 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.277419 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dz9wv\" (UniqueName: \"kubernetes.io/projected/8ae4e677-ac3a-478a-b4bd-4472d232259b-kube-api-access-dz9wv\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:43.277713 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.277692 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:43.277838 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:43.277813 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 17:35:43.277910 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:43.277900 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls podName:8ae4e677-ac3a-478a-b4bd-4472d232259b nodeName:}" failed. No retries permitted until 2026-04-21 17:35:43.777873415 +0000 UTC m=+130.917936183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dz8ps" (UID: "8ae4e677-ac3a-478a-b4bd-4472d232259b") : secret "samples-operator-tls" not found Apr 21 17:35:43.289377 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.289337 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz9wv\" (UniqueName: \"kubernetes.io/projected/8ae4e677-ac3a-478a-b4bd-4472d232259b-kube-api-access-dz9wv\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:43.681520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.681484 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn" Apr 21 17:35:43.681520 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.681520 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn" Apr 21 17:35:43.681728 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:43.681641 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:47.681621871 +0000 UTC m=+134.821684639 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : configmap references non-existent config key: service-ca.crt Apr 21 17:35:43.681728 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:43.681692 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 17:35:43.681810 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:43.681757 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:47.681742411 +0000 UTC m=+134.821805156 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : secret "router-metrics-certs-default" not found Apr 21 17:35:43.782717 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:43.782679 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:43.782876 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:43.782826 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 17:35:43.782915 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:43.782884 2583 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls podName:8ae4e677-ac3a-478a-b4bd-4472d232259b nodeName:}" failed. No retries permitted until 2026-04-21 17:35:44.78287003 +0000 UTC m=+131.922932775 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dz8ps" (UID: "8ae4e677-ac3a-478a-b4bd-4472d232259b") : secret "samples-operator-tls" not found Apr 21 17:35:44.790719 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:44.790684 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:44.791091 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:44.790804 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 17:35:44.791091 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:44.790858 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls podName:8ae4e677-ac3a-478a-b4bd-4472d232259b nodeName:}" failed. No retries permitted until 2026-04-21 17:35:46.790843418 +0000 UTC m=+133.930906164 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dz8ps" (UID: "8ae4e677-ac3a-478a-b4bd-4472d232259b") : secret "samples-operator-tls" not found Apr 21 17:35:45.589663 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:45.589633 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hccth_16529597-1f2f-47de-ade5-9fb7b122147c/dns-node-resolver/0.log" Apr 21 17:35:46.394623 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:46.394593 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hmlbc_d5d8e485-86a1-4255-912b-af222842087c/node-ca/0.log" Apr 21 17:35:46.808056 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:46.808024 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:46.808233 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:46.808187 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 17:35:46.808281 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:46.808257 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls podName:8ae4e677-ac3a-478a-b4bd-4472d232259b nodeName:}" failed. No retries permitted until 2026-04-21 17:35:50.8082418 +0000 UTC m=+137.948304548 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dz8ps" (UID: "8ae4e677-ac3a-478a-b4bd-4472d232259b") : secret "samples-operator-tls" not found Apr 21 17:35:47.715556 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:47.715515 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn" Apr 21 17:35:47.715556 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:47.715555 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn" Apr 21 17:35:47.715974 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:47.715650 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 17:35:47.715974 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:47.715685 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:55.715665851 +0000 UTC m=+142.855728596 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : configmap references non-existent config key: service-ca.crt Apr 21 17:35:47.715974 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:47.715710 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:55.715703119 +0000 UTC m=+142.855765864 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : secret "router-metrics-certs-default" not found Apr 21 17:35:48.034828 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.034753 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4"] Apr 21 17:35:48.037567 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.037550 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.040144 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.040111 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 17:35:48.040268 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.040208 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 17:35:48.041267 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.041250 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 17:35:48.041347 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.041265 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-w8chm\"" Apr 21 17:35:48.041347 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.041253 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:35:48.046323 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.046305 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4"] Apr 21 17:35:48.118370 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.118317 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nghrp\" (UniqueName: \"kubernetes.io/projected/343fcef3-240d-459d-8c84-7164f0722f10-kube-api-access-nghrp\") pod \"kube-storage-version-migrator-operator-6769c5d45-h7cx4\" (UID: 
\"343fcef3-240d-459d-8c84-7164f0722f10\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.118558 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.118394 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343fcef3-240d-459d-8c84-7164f0722f10-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-h7cx4\" (UID: \"343fcef3-240d-459d-8c84-7164f0722f10\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.118558 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.118455 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/343fcef3-240d-459d-8c84-7164f0722f10-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-h7cx4\" (UID: \"343fcef3-240d-459d-8c84-7164f0722f10\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.219062 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.219006 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nghrp\" (UniqueName: \"kubernetes.io/projected/343fcef3-240d-459d-8c84-7164f0722f10-kube-api-access-nghrp\") pod \"kube-storage-version-migrator-operator-6769c5d45-h7cx4\" (UID: \"343fcef3-240d-459d-8c84-7164f0722f10\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.219226 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.219097 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343fcef3-240d-459d-8c84-7164f0722f10-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-h7cx4\" 
(UID: \"343fcef3-240d-459d-8c84-7164f0722f10\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.219226 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.219206 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/343fcef3-240d-459d-8c84-7164f0722f10-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-h7cx4\" (UID: \"343fcef3-240d-459d-8c84-7164f0722f10\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.219705 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.219681 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/343fcef3-240d-459d-8c84-7164f0722f10-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-h7cx4\" (UID: \"343fcef3-240d-459d-8c84-7164f0722f10\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.221790 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.221771 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/343fcef3-240d-459d-8c84-7164f0722f10-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-h7cx4\" (UID: \"343fcef3-240d-459d-8c84-7164f0722f10\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.227699 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.227678 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nghrp\" (UniqueName: \"kubernetes.io/projected/343fcef3-240d-459d-8c84-7164f0722f10-kube-api-access-nghrp\") pod \"kube-storage-version-migrator-operator-6769c5d45-h7cx4\" (UID: \"343fcef3-240d-459d-8c84-7164f0722f10\") 
" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.346364 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.346285 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" Apr 21 17:35:48.458725 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.458687 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4"] Apr 21 17:35:48.461703 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:35:48.461673 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343fcef3_240d_459d_8c84_7164f0722f10.slice/crio-f8ad6f66f853e8f3b7f64a2b2767a5dfc374f621fffce62f05e42fe52b5ede52 WatchSource:0}: Error finding container f8ad6f66f853e8f3b7f64a2b2767a5dfc374f621fffce62f05e42fe52b5ede52: Status 404 returned error can't find the container with id f8ad6f66f853e8f3b7f64a2b2767a5dfc374f621fffce62f05e42fe52b5ede52 Apr 21 17:35:48.857670 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:48.857637 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" event={"ID":"343fcef3-240d-459d-8c84-7164f0722f10","Type":"ContainerStarted","Data":"f8ad6f66f853e8f3b7f64a2b2767a5dfc374f621fffce62f05e42fe52b5ede52"} Apr 21 17:35:50.103038 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.103005 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz"] Apr 21 17:35:50.105925 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.105903 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.108430 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.108409 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 17:35:50.108562 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.108532 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:35:50.108849 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.108833 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 17:35:50.109519 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.109504 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 17:35:50.109581 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.109504 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-mrkrh\"" Apr 21 17:35:50.112771 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.112747 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz"] Apr 21 17:35:50.237458 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.237420 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accd670b-edfb-4f84-9bb3-c72f1ca32432-serving-cert\") pod \"service-ca-operator-d6fc45fc5-m4dlz\" (UID: \"accd670b-edfb-4f84-9bb3-c72f1ca32432\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.237625 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.237546 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnjw9\" (UniqueName: \"kubernetes.io/projected/accd670b-edfb-4f84-9bb3-c72f1ca32432-kube-api-access-gnjw9\") pod \"service-ca-operator-d6fc45fc5-m4dlz\" (UID: \"accd670b-edfb-4f84-9bb3-c72f1ca32432\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.237625 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.237608 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accd670b-edfb-4f84-9bb3-c72f1ca32432-config\") pod \"service-ca-operator-d6fc45fc5-m4dlz\" (UID: \"accd670b-edfb-4f84-9bb3-c72f1ca32432\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.338389 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.338355 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnjw9\" (UniqueName: \"kubernetes.io/projected/accd670b-edfb-4f84-9bb3-c72f1ca32432-kube-api-access-gnjw9\") pod \"service-ca-operator-d6fc45fc5-m4dlz\" (UID: \"accd670b-edfb-4f84-9bb3-c72f1ca32432\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.338546 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.338407 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accd670b-edfb-4f84-9bb3-c72f1ca32432-config\") pod \"service-ca-operator-d6fc45fc5-m4dlz\" (UID: \"accd670b-edfb-4f84-9bb3-c72f1ca32432\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.338621 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.338598 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accd670b-edfb-4f84-9bb3-c72f1ca32432-serving-cert\") pod \"service-ca-operator-d6fc45fc5-m4dlz\" 
(UID: \"accd670b-edfb-4f84-9bb3-c72f1ca32432\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.339039 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.339018 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/accd670b-edfb-4f84-9bb3-c72f1ca32432-config\") pod \"service-ca-operator-d6fc45fc5-m4dlz\" (UID: \"accd670b-edfb-4f84-9bb3-c72f1ca32432\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.340957 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.340930 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/accd670b-edfb-4f84-9bb3-c72f1ca32432-serving-cert\") pod \"service-ca-operator-d6fc45fc5-m4dlz\" (UID: \"accd670b-edfb-4f84-9bb3-c72f1ca32432\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.346454 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.346424 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnjw9\" (UniqueName: \"kubernetes.io/projected/accd670b-edfb-4f84-9bb3-c72f1ca32432-kube-api-access-gnjw9\") pod \"service-ca-operator-d6fc45fc5-m4dlz\" (UID: \"accd670b-edfb-4f84-9bb3-c72f1ca32432\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.416463 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.416383 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" Apr 21 17:35:50.537332 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.537299 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz"] Apr 21 17:35:50.843340 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.843295 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:50.843525 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:50.843474 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 17:35:50.843596 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:50.843556 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls podName:8ae4e677-ac3a-478a-b4bd-4472d232259b nodeName:}" failed. No retries permitted until 2026-04-21 17:35:58.843534447 +0000 UTC m=+145.983597204 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dz8ps" (UID: "8ae4e677-ac3a-478a-b4bd-4472d232259b") : secret "samples-operator-tls" not found Apr 21 17:35:50.862532 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:50.862501 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" event={"ID":"accd670b-edfb-4f84-9bb3-c72f1ca32432","Type":"ContainerStarted","Data":"edd052df2472259d603949062eeb48418cfe32cac060e03d032cb271738bc789"} Apr 21 17:35:51.865457 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:51.865421 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" event={"ID":"343fcef3-240d-459d-8c84-7164f0722f10","Type":"ContainerStarted","Data":"69c1acfa0d8ae5f48b1166c7b88c9b6b6246433dded32e3baec56a94b72e9068"} Apr 21 17:35:51.882070 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:51.882013 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" podStartSLOduration=1.26623496 podStartE2EDuration="3.881995232s" podCreationTimestamp="2026-04-21 17:35:48 +0000 UTC" firstStartedPulling="2026-04-21 17:35:48.463371553 +0000 UTC m=+135.603434298" lastFinishedPulling="2026-04-21 17:35:51.079131822 +0000 UTC m=+138.219194570" observedRunningTime="2026-04-21 17:35:51.880978845 +0000 UTC m=+139.021041626" watchObservedRunningTime="2026-04-21 17:35:51.881995232 +0000 UTC m=+139.022058002" Apr 21 17:35:53.870504 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:53.870462 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" 
event={"ID":"accd670b-edfb-4f84-9bb3-c72f1ca32432","Type":"ContainerStarted","Data":"836879d8dfcd28791c2dfee2be5d60e274fd6c08793c8f4616a5c2563187203c"} Apr 21 17:35:53.889783 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:53.889736 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" podStartSLOduration=1.161804611 podStartE2EDuration="3.889720942s" podCreationTimestamp="2026-04-21 17:35:50 +0000 UTC" firstStartedPulling="2026-04-21 17:35:50.548958135 +0000 UTC m=+137.689020883" lastFinishedPulling="2026-04-21 17:35:53.276874455 +0000 UTC m=+140.416937214" observedRunningTime="2026-04-21 17:35:53.888821828 +0000 UTC m=+141.028884595" watchObservedRunningTime="2026-04-21 17:35:53.889720942 +0000 UTC m=+141.029783709" Apr 21 17:35:55.784688 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:55.784623 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn" Apr 21 17:35:55.784688 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:55.784696 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn" Apr 21 17:35:55.785278 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:55.784817 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. 
No retries permitted until 2026-04-21 17:36:11.784792267 +0000 UTC m=+158.924855016 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : configmap references non-existent config key: service-ca.crt Apr 21 17:35:55.785278 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:55.784856 2583 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 17:35:55.785278 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:55.784911 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs podName:3a283e8b-7dfc-4c49-9afd-4adb1c192587 nodeName:}" failed. No retries permitted until 2026-04-21 17:36:11.784898164 +0000 UTC m=+158.924960914 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs") pod "router-default-bcf746f86-2s4sn" (UID: "3a283e8b-7dfc-4c49-9afd-4adb1c192587") : secret "router-metrics-certs-default" not found Apr 21 17:35:56.789499 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.789462 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kwqkm"] Apr 21 17:35:56.792383 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.792367 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:56.794781 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.794757 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 17:35:56.795839 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.795812 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qbvtz\"" Apr 21 17:35:56.795954 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.795815 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 17:35:56.795954 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.795827 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 17:35:56.795954 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.795928 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 17:35:56.800625 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.800602 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kwqkm"] Apr 21 17:35:56.893548 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.893514 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a91aab9b-c4c3-4a2a-bd4d-04ab08463e07-signing-key\") pod \"service-ca-865cb79987-kwqkm\" (UID: \"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07\") " pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:56.893548 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.893544 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5ljk\" (UniqueName: 
\"kubernetes.io/projected/a91aab9b-c4c3-4a2a-bd4d-04ab08463e07-kube-api-access-l5ljk\") pod \"service-ca-865cb79987-kwqkm\" (UID: \"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07\") " pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:56.893748 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.893565 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a91aab9b-c4c3-4a2a-bd4d-04ab08463e07-signing-cabundle\") pod \"service-ca-865cb79987-kwqkm\" (UID: \"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07\") " pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:56.994665 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.994625 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a91aab9b-c4c3-4a2a-bd4d-04ab08463e07-signing-key\") pod \"service-ca-865cb79987-kwqkm\" (UID: \"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07\") " pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:56.994665 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.994662 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5ljk\" (UniqueName: \"kubernetes.io/projected/a91aab9b-c4c3-4a2a-bd4d-04ab08463e07-kube-api-access-l5ljk\") pod \"service-ca-865cb79987-kwqkm\" (UID: \"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07\") " pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:56.994863 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.994681 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a91aab9b-c4c3-4a2a-bd4d-04ab08463e07-signing-cabundle\") pod \"service-ca-865cb79987-kwqkm\" (UID: \"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07\") " pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:56.995305 ip-10-0-129-92 kubenswrapper[2583]: I0421 
17:35:56.995281 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a91aab9b-c4c3-4a2a-bd4d-04ab08463e07-signing-cabundle\") pod \"service-ca-865cb79987-kwqkm\" (UID: \"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07\") " pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:56.997183 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:56.997158 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a91aab9b-c4c3-4a2a-bd4d-04ab08463e07-signing-key\") pod \"service-ca-865cb79987-kwqkm\" (UID: \"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07\") " pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:57.004101 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.004078 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5ljk\" (UniqueName: \"kubernetes.io/projected/a91aab9b-c4c3-4a2a-bd4d-04ab08463e07-kube-api-access-l5ljk\") pod \"service-ca-865cb79987-kwqkm\" (UID: \"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07\") " pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:57.020806 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.020777 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dllbm"] Apr 21 17:35:57.024083 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.024063 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.026663 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.026641 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 17:35:57.026774 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.026641 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 17:35:57.026774 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.026644 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 17:35:57.026885 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.026804 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-55shw\"" Apr 21 17:35:57.027054 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.027039 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 17:35:57.033106 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.033087 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dllbm"] Apr 21 17:35:57.095323 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.095236 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.095323 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.095281 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c73551f6-0e85-4d52-beca-b016ec94ad43-data-volume\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.095323 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.095299 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c73551f6-0e85-4d52-beca-b016ec94ad43-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.095323 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.095317 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8px8\" (UniqueName: \"kubernetes.io/projected/c73551f6-0e85-4d52-beca-b016ec94ad43-kube-api-access-l8px8\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.095728 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.095468 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c73551f6-0e85-4d52-beca-b016ec94ad43-crio-socket\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.101240 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.101204 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kwqkm" Apr 21 17:35:57.195943 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.195917 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c73551f6-0e85-4d52-beca-b016ec94ad43-crio-socket\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.196110 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.195964 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.196110 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.195998 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c73551f6-0e85-4d52-beca-b016ec94ad43-data-volume\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.196110 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.196014 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c73551f6-0e85-4d52-beca-b016ec94ad43-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.196110 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.196030 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8px8\" 
(UniqueName: \"kubernetes.io/projected/c73551f6-0e85-4d52-beca-b016ec94ad43-kube-api-access-l8px8\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.196110 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.196037 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/c73551f6-0e85-4d52-beca-b016ec94ad43-crio-socket\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.196368 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:57.196150 2583 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 17:35:57.196368 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:57.196232 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls podName:c73551f6-0e85-4d52-beca-b016ec94ad43 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:57.696213992 +0000 UTC m=+144.836276738 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dllbm" (UID: "c73551f6-0e85-4d52-beca-b016ec94ad43") : secret "insights-runtime-extractor-tls" not found Apr 21 17:35:57.196477 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.196381 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/c73551f6-0e85-4d52-beca-b016ec94ad43-data-volume\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.196724 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.196703 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/c73551f6-0e85-4d52-beca-b016ec94ad43-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.205148 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.205082 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8px8\" (UniqueName: \"kubernetes.io/projected/c73551f6-0e85-4d52-beca-b016ec94ad43-kube-api-access-l8px8\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.221784 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.221750 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kwqkm"] Apr 21 17:35:57.224828 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:35:57.224798 2583 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda91aab9b_c4c3_4a2a_bd4d_04ab08463e07.slice/crio-dec13fbb067c8be05e78caf79774f1dbe1d798a708d339dd11978e303db608ac WatchSource:0}: Error finding container dec13fbb067c8be05e78caf79774f1dbe1d798a708d339dd11978e303db608ac: Status 404 returned error can't find the container with id dec13fbb067c8be05e78caf79774f1dbe1d798a708d339dd11978e303db608ac Apr 21 17:35:57.700309 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.700260 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:57.700488 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:57.700402 2583 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 17:35:57.700488 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:57.700471 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls podName:c73551f6-0e85-4d52-beca-b016ec94ad43 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:58.700455374 +0000 UTC m=+145.840518119 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dllbm" (UID: "c73551f6-0e85-4d52-beca-b016ec94ad43") : secret "insights-runtime-extractor-tls" not found Apr 21 17:35:57.880209 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.880151 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kwqkm" event={"ID":"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07","Type":"ContainerStarted","Data":"d551f8d9d1a1319ce2d02f865f0667924d967bfb700510fbfa965fbb56fe420a"} Apr 21 17:35:57.880209 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:57.880212 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kwqkm" event={"ID":"a91aab9b-c4c3-4a2a-bd4d-04ab08463e07","Type":"ContainerStarted","Data":"dec13fbb067c8be05e78caf79774f1dbe1d798a708d339dd11978e303db608ac"} Apr 21 17:35:58.707684 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:58.707650 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:35:58.707836 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:58.707814 2583 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 17:35:58.707901 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:58.707889 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls podName:c73551f6-0e85-4d52-beca-b016ec94ad43 nodeName:}" failed. 
No retries permitted until 2026-04-21 17:36:00.707868839 +0000 UTC m=+147.847931590 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dllbm" (UID: "c73551f6-0e85-4d52-beca-b016ec94ad43") : secret "insights-runtime-extractor-tls" not found Apr 21 17:35:58.909434 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:35:58.909389 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" Apr 21 17:35:58.909901 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:58.909565 2583 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 17:35:58.909901 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:35:58.909645 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls podName:8ae4e677-ac3a-478a-b4bd-4472d232259b nodeName:}" failed. No retries permitted until 2026-04-21 17:36:14.909626288 +0000 UTC m=+162.049689037 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dz8ps" (UID: "8ae4e677-ac3a-478a-b4bd-4472d232259b") : secret "samples-operator-tls" not found Apr 21 17:36:00.723390 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:00.723344 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:36:00.723870 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:36:00.723512 2583 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 17:36:00.723870 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:36:00.723597 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls podName:c73551f6-0e85-4d52-beca-b016ec94ad43 nodeName:}" failed. No retries permitted until 2026-04-21 17:36:04.723575486 +0000 UTC m=+151.863638231 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls") pod "insights-runtime-extractor-dllbm" (UID: "c73551f6-0e85-4d52-beca-b016ec94ad43") : secret "insights-runtime-extractor-tls" not found Apr 21 17:36:04.757990 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:04.757951 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:36:04.760763 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:04.760733 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/c73551f6-0e85-4d52-beca-b016ec94ad43-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dllbm\" (UID: \"c73551f6-0e85-4d52-beca-b016ec94ad43\") " pod="openshift-insights/insights-runtime-extractor-dllbm" Apr 21 17:36:04.833078 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:04.833039 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dllbm"
Apr 21 17:36:04.953642 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:04.953556 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-kwqkm" podStartSLOduration=8.953538005 podStartE2EDuration="8.953538005s" podCreationTimestamp="2026-04-21 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:35:57.900326353 +0000 UTC m=+145.040389119" watchObservedRunningTime="2026-04-21 17:36:04.953538005 +0000 UTC m=+152.093600774"
Apr 21 17:36:04.954926 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:04.954901 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dllbm"]
Apr 21 17:36:04.957506 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:04.957473 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc73551f6_0e85_4d52_beca_b016ec94ad43.slice/crio-24070f98a78de5039c985ebbbad935828c4f8a60c02c33c1a8af2e9aee7fff6e WatchSource:0}: Error finding container 24070f98a78de5039c985ebbbad935828c4f8a60c02c33c1a8af2e9aee7fff6e: Status 404 returned error can't find the container with id 24070f98a78de5039c985ebbbad935828c4f8a60c02c33c1a8af2e9aee7fff6e
Apr 21 17:36:05.899046 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:05.898956 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dllbm" event={"ID":"c73551f6-0e85-4d52-beca-b016ec94ad43","Type":"ContainerStarted","Data":"0bda0513ec21e03cdf00d86d7c57f4c8644ddfdcb53284fffd902c41fd76d866"}
Apr 21 17:36:05.899046 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:05.899001 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dllbm" event={"ID":"c73551f6-0e85-4d52-beca-b016ec94ad43","Type":"ContainerStarted","Data":"57a878230212cd5cc3d386e9aa5e987025f7dd8d4443ad26e7251f84b2b2cefe"}
Apr 21 17:36:05.899046 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:05.899014 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dllbm" event={"ID":"c73551f6-0e85-4d52-beca-b016ec94ad43","Type":"ContainerStarted","Data":"24070f98a78de5039c985ebbbad935828c4f8a60c02c33c1a8af2e9aee7fff6e"}
Apr 21 17:36:07.906581 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:07.906533 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dllbm" event={"ID":"c73551f6-0e85-4d52-beca-b016ec94ad43","Type":"ContainerStarted","Data":"5f9af0210d3712b1533d3e414a051a58353f12cd548194e0f0b520cd0ca1d5d0"}
Apr 21 17:36:07.926309 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:07.926265 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dllbm" podStartSLOduration=8.717688924 podStartE2EDuration="10.926253035s" podCreationTimestamp="2026-04-21 17:35:57 +0000 UTC" firstStartedPulling="2026-04-21 17:36:05.015015528 +0000 UTC m=+152.155078288" lastFinishedPulling="2026-04-21 17:36:07.223579644 +0000 UTC m=+154.363642399" observedRunningTime="2026-04-21 17:36:07.925026291 +0000 UTC m=+155.065089059" watchObservedRunningTime="2026-04-21 17:36:07.926253035 +0000 UTC m=+155.066315865"
Apr 21 17:36:08.283559 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:36:08.283512 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-4cg4j" podUID="53e51873-1fc7-47fe-b3f9-224f9c0521d4"
Apr 21 17:36:08.299847 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:36:08.299811 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-tk5wc" podUID="c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124"
Apr 21 17:36:08.909436 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:08.909401 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4cg4j"
Apr 21 17:36:09.472265 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:36:09.472229 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rfmv6" podUID="38cd15ba-d0c7-4b4f-b220-f72981ccd9da"
Apr 21 17:36:11.819471 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:11.819417 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:36:11.819471 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:11.819464 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:36:11.820534 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:11.820512 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a283e8b-7dfc-4c49-9afd-4adb1c192587-service-ca-bundle\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:36:11.821985 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:11.821963 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3a283e8b-7dfc-4c49-9afd-4adb1c192587-metrics-certs\") pod \"router-default-bcf746f86-2s4sn\" (UID: \"3a283e8b-7dfc-4c49-9afd-4adb1c192587\") " pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:36:12.119502 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:12.119414 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:36:12.235466 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:12.235442 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-bcf746f86-2s4sn"]
Apr 21 17:36:12.237736 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:12.237715 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a283e8b_7dfc_4c49_9afd_4adb1c192587.slice/crio-ad40e39c04dbb76faad28f3a5229713d01bb27c21e718657a4f5c9ac03e33db4 WatchSource:0}: Error finding container ad40e39c04dbb76faad28f3a5229713d01bb27c21e718657a4f5c9ac03e33db4: Status 404 returned error can't find the container with id ad40e39c04dbb76faad28f3a5229713d01bb27c21e718657a4f5c9ac03e33db4
Apr 21 17:36:12.920809 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:12.920770 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-bcf746f86-2s4sn" event={"ID":"3a283e8b-7dfc-4c49-9afd-4adb1c192587","Type":"ContainerStarted","Data":"f7fb673102fb123d8decabb79dfea0aa63f73d2939e5426f5cfa3a8420e2c535"}
Apr 21 17:36:12.920809 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:12.920807 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-bcf746f86-2s4sn" event={"ID":"3a283e8b-7dfc-4c49-9afd-4adb1c192587","Type":"ContainerStarted","Data":"ad40e39c04dbb76faad28f3a5229713d01bb27c21e718657a4f5c9ac03e33db4"}
Apr 21 17:36:12.939519 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:12.939470 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-bcf746f86-2s4sn" podStartSLOduration=33.939457553 podStartE2EDuration="33.939457553s" podCreationTimestamp="2026-04-21 17:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:36:12.938599027 +0000 UTC m=+160.078661791" watchObservedRunningTime="2026-04-21 17:36:12.939457553 +0000 UTC m=+160.079520319"
Apr 21 17:36:13.119755 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.119707 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:36:13.122278 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.122255 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:36:13.332261 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.332218 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:36:13.332429 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.332307 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j"
Apr 21 17:36:13.334715 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.334694 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53e51873-1fc7-47fe-b3f9-224f9c0521d4-cert\") pod \"ingress-canary-4cg4j\" (UID: \"53e51873-1fc7-47fe-b3f9-224f9c0521d4\") " pod="openshift-ingress-canary/ingress-canary-4cg4j"
Apr 21 17:36:13.334779 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.334729 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124-metrics-tls\") pod \"dns-default-tk5wc\" (UID: \"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124\") " pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:36:13.413521 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.413488 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4fbh6\""
Apr 21 17:36:13.421651 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.421632 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4cg4j"
Apr 21 17:36:13.539590 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.539555 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4cg4j"]
Apr 21 17:36:13.542256 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:13.542227 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e51873_1fc7_47fe_b3f9_224f9c0521d4.slice/crio-58f9c4f02027bc531a520af1afd15254afe45eb772f1ca87cb440d540be31c7b WatchSource:0}: Error finding container 58f9c4f02027bc531a520af1afd15254afe45eb772f1ca87cb440d540be31c7b: Status 404 returned error can't find the container with id 58f9c4f02027bc531a520af1afd15254afe45eb772f1ca87cb440d540be31c7b
Apr 21 17:36:13.924718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.924633 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4cg4j" event={"ID":"53e51873-1fc7-47fe-b3f9-224f9c0521d4","Type":"ContainerStarted","Data":"58f9c4f02027bc531a520af1afd15254afe45eb772f1ca87cb440d540be31c7b"}
Apr 21 17:36:13.925093 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.924793 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:36:13.926094 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:13.926073 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-bcf746f86-2s4sn"
Apr 21 17:36:14.946010 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:14.945977 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps"
Apr 21 17:36:14.949002 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:14.948978 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8ae4e677-ac3a-478a-b4bd-4472d232259b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dz8ps\" (UID: \"8ae4e677-ac3a-478a-b4bd-4472d232259b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps"
Apr 21 17:36:15.233655 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:15.233618 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps"
Apr 21 17:36:15.354950 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:15.354916 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps"]
Apr 21 17:36:15.931021 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:15.930980 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4cg4j" event={"ID":"53e51873-1fc7-47fe-b3f9-224f9c0521d4","Type":"ContainerStarted","Data":"409353885cf11cfd905bbba7be08585b7708573056a5435ff779f54299a346f9"}
Apr 21 17:36:15.931977 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:15.931950 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" event={"ID":"8ae4e677-ac3a-478a-b4bd-4472d232259b","Type":"ContainerStarted","Data":"bb9f048996f7d4b2ac04933a64a8522d844426563e3dc1cc9d2a90dc00f487b8"}
Apr 21 17:36:15.948380 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:15.948330 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4cg4j" podStartSLOduration=129.386437727 podStartE2EDuration="2m10.948315698s" podCreationTimestamp="2026-04-21 17:34:05 +0000 UTC" firstStartedPulling="2026-04-21 17:36:13.544059304 +0000 UTC m=+160.684122062" lastFinishedPulling="2026-04-21 17:36:15.105937285 +0000 UTC m=+162.246000033" observedRunningTime="2026-04-21 17:36:15.947435331 +0000 UTC m=+163.087498097" watchObservedRunningTime="2026-04-21 17:36:15.948315698 +0000 UTC m=+163.088378462"
Apr 21 17:36:17.939969 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:17.939936 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" event={"ID":"8ae4e677-ac3a-478a-b4bd-4472d232259b","Type":"ContainerStarted","Data":"537d4f31a22ff96f67319c878d3acbe40340a1bbb6766b0097597d9e1b10de4c"}
Apr 21 17:36:17.939969 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:17.939972 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" event={"ID":"8ae4e677-ac3a-478a-b4bd-4472d232259b","Type":"ContainerStarted","Data":"55a40cf678ebb3ac28cbb787f946fa574ee18ad03d49e7da407d7b303780f5cb"}
Apr 21 17:36:17.956859 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:17.956811 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dz8ps" podStartSLOduration=32.551993825 podStartE2EDuration="34.956797168s" podCreationTimestamp="2026-04-21 17:35:43 +0000 UTC" firstStartedPulling="2026-04-21 17:36:15.402737556 +0000 UTC m=+162.542800300" lastFinishedPulling="2026-04-21 17:36:17.807540897 +0000 UTC m=+164.947603643" observedRunningTime="2026-04-21 17:36:17.956252809 +0000 UTC m=+165.096315576" watchObservedRunningTime="2026-04-21 17:36:17.956797168 +0000 UTC m=+165.096859934"
Apr 21 17:36:18.888695 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:18.888656 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"]
Apr 21 17:36:18.891665 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:18.891644 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"
Apr 21 17:36:18.895307 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:18.895285 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 21 17:36:18.895408 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:18.895343 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-hhrbk\""
Apr 21 17:36:18.899617 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:18.899594 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"]
Apr 21 17:36:19.011904 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.011871 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86bffdb85-f9pnq"]
Apr 21 17:36:19.015361 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.015340 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.022821 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.022798 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n6n9k\""
Apr 21 17:36:19.023528 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.023509 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 17:36:19.023713 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.023520 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 17:36:19.024956 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.024938 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 17:36:19.034465 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.030904 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 17:36:19.036840 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.036817 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86bffdb85-f9pnq"]
Apr 21 17:36:19.079102 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.079061 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/82faf712-4dd3-4d34-8d74-ef5e8b3aa92c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-nzxcj\" (UID: \"82faf712-4dd3-4d34-8d74-ef5e8b3aa92c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"
Apr 21 17:36:19.179958 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.179871 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-installation-pull-secrets\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.179958 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.179910 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-image-registry-private-configuration\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.179958 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.179933 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjr5h\" (UniqueName: \"kubernetes.io/projected/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-kube-api-access-mjr5h\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.180167 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.180003 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-registry-tls\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.180167 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.180035 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-bound-sa-token\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.180167 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.180080 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-registry-certificates\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.180167 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.180101 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/82faf712-4dd3-4d34-8d74-ef5e8b3aa92c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-nzxcj\" (UID: \"82faf712-4dd3-4d34-8d74-ef5e8b3aa92c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"
Apr 21 17:36:19.180167 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.180125 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-ca-trust-extracted\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.180167 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.180141 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-trusted-ca\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.182770 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.182735 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/82faf712-4dd3-4d34-8d74-ef5e8b3aa92c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-nzxcj\" (UID: \"82faf712-4dd3-4d34-8d74-ef5e8b3aa92c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"
Apr 21 17:36:19.209590 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.204303 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"
Apr 21 17:36:19.280891 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.280854 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-registry-tls\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.281055 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.280904 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-bound-sa-token\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.281055 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.280941 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-registry-certificates\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.281055 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.280974 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-ca-trust-extracted\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.281055 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.280996 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-trusted-ca\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.281055 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.281049 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-installation-pull-secrets\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.281336 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.281076 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-image-registry-private-configuration\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.281336 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.281107 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjr5h\" (UniqueName: \"kubernetes.io/projected/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-kube-api-access-mjr5h\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.282122 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.281880 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-ca-trust-extracted\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.282513 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.282484 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-trusted-ca\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.282648 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.282623 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-registry-certificates\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.285337 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.285314 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-registry-tls\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.285782 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.285699 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-image-registry-private-configuration\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.285782 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.285742 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-installation-pull-secrets\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.292216 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.292196 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjr5h\" (UniqueName: \"kubernetes.io/projected/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-kube-api-access-mjr5h\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.293280 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.293228 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec7cf3c5-c440-4bc6-a03d-076cc402b14a-bound-sa-token\") pod \"image-registry-86bffdb85-f9pnq\" (UID: \"ec7cf3c5-c440-4bc6-a03d-076cc402b14a\") " pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.324835 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.324801 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.328345 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.328320 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"]
Apr 21 17:36:19.331108 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:19.331086 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82faf712_4dd3_4d34_8d74_ef5e8b3aa92c.slice/crio-97fd33d6912b227677323adbf9c299458a610848c67f31b6b7efa32406c8e03d WatchSource:0}: Error finding container 97fd33d6912b227677323adbf9c299458a610848c67f31b6b7efa32406c8e03d: Status 404 returned error can't find the container with id 97fd33d6912b227677323adbf9c299458a610848c67f31b6b7efa32406c8e03d
Apr 21 17:36:19.444626 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.444540 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86bffdb85-f9pnq"]
Apr 21 17:36:19.447891 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:19.447865 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec7cf3c5_c440_4bc6_a03d_076cc402b14a.slice/crio-25e29e066b4ae5bf7970b89ade15e0ef953001e8b3693d176207ed84ab76f875 WatchSource:0}: Error finding container 25e29e066b4ae5bf7970b89ade15e0ef953001e8b3693d176207ed84ab76f875: Status 404 returned error can't find the container with id 25e29e066b4ae5bf7970b89ade15e0ef953001e8b3693d176207ed84ab76f875
Apr 21 17:36:19.459955 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.459933 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:36:19.462438 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.462420 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mlvk8\""
Apr 21 17:36:19.470442 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.470421 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:36:19.595900 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.595866 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tk5wc"]
Apr 21 17:36:19.600529 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:19.600501 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3cd5d3c_ce71_40e7_aaf4_cef9eac5f124.slice/crio-110ce14e096bfb76b0d6f84b6655d38e0a3ce5ec7b806655e5d7121e7fa26f1a WatchSource:0}: Error finding container 110ce14e096bfb76b0d6f84b6655d38e0a3ce5ec7b806655e5d7121e7fa26f1a: Status 404 returned error can't find the container with id 110ce14e096bfb76b0d6f84b6655d38e0a3ce5ec7b806655e5d7121e7fa26f1a
Apr 21 17:36:19.946322 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.946280 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tk5wc" event={"ID":"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124","Type":"ContainerStarted","Data":"110ce14e096bfb76b0d6f84b6655d38e0a3ce5ec7b806655e5d7121e7fa26f1a"}
Apr 21 17:36:19.947665 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.947625 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86bffdb85-f9pnq" event={"ID":"ec7cf3c5-c440-4bc6-a03d-076cc402b14a","Type":"ContainerStarted","Data":"529a96549399d675afa365cb77a26fdabff4e4032ba46a1327dea3b41519ea9e"}
Apr 21 17:36:19.947665 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.947659 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86bffdb85-f9pnq" event={"ID":"ec7cf3c5-c440-4bc6-a03d-076cc402b14a","Type":"ContainerStarted","Data":"25e29e066b4ae5bf7970b89ade15e0ef953001e8b3693d176207ed84ab76f875"}
Apr 21 17:36:19.947848 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.947751 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86bffdb85-f9pnq"
Apr 21 17:36:19.948784 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.948758 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj" event={"ID":"82faf712-4dd3-4d34-8d74-ef5e8b3aa92c","Type":"ContainerStarted","Data":"97fd33d6912b227677323adbf9c299458a610848c67f31b6b7efa32406c8e03d"}
Apr 21 17:36:19.965861 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:19.965818 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86bffdb85-f9pnq" podStartSLOduration=1.965800415 podStartE2EDuration="1.965800415s" podCreationTimestamp="2026-04-21 17:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:36:19.965563858 +0000 UTC m=+167.105626637" watchObservedRunningTime="2026-04-21 17:36:19.965800415 +0000 UTC m=+167.105863183"
Apr 21 17:36:20.460412 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:20.460374 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6"
Apr 21 17:36:21.956703 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:21.956666 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tk5wc" event={"ID":"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124","Type":"ContainerStarted","Data":"52e7c29e75fb9a580a7605e0a41fbb8cb82153392a2dc5506a7f332d6eb28fda"}
Apr 21 17:36:21.956703 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:21.956701 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tk5wc" event={"ID":"c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124","Type":"ContainerStarted","Data":"e35a23074ba0fd7cadd957bb61c1b271de26794806fe9d849bdc09b0d7909357"}
Apr 21 17:36:21.957215 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:21.956762 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:36:21.958003 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:21.957983 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj" event={"ID":"82faf712-4dd3-4d34-8d74-ef5e8b3aa92c","Type":"ContainerStarted","Data":"4151d198dbdf7c0dbecd42e29567094ffcedb028c4983d028f13d05a134e0394"}
Apr 21 17:36:21.958151 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:21.958137 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"
Apr 21 17:36:21.962822 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:21.962803 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj"
Apr 21 17:36:21.983240 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:21.983196 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tk5wc" podStartSLOduration=135.565053235
podStartE2EDuration="2m16.983159205s" podCreationTimestamp="2026-04-21 17:34:05 +0000 UTC" firstStartedPulling="2026-04-21 17:36:19.602449347 +0000 UTC m=+166.742512092" lastFinishedPulling="2026-04-21 17:36:21.020555315 +0000 UTC m=+168.160618062" observedRunningTime="2026-04-21 17:36:21.982784018 +0000 UTC m=+169.122846811" watchObservedRunningTime="2026-04-21 17:36:21.983159205 +0000 UTC m=+169.123221973" Apr 21 17:36:22.000837 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:22.000790 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-nzxcj" podStartSLOduration=2.316658943 podStartE2EDuration="4.000777757s" podCreationTimestamp="2026-04-21 17:36:18 +0000 UTC" firstStartedPulling="2026-04-21 17:36:19.332916652 +0000 UTC m=+166.472979414" lastFinishedPulling="2026-04-21 17:36:21.017035464 +0000 UTC m=+168.157098228" observedRunningTime="2026-04-21 17:36:22.000025024 +0000 UTC m=+169.140087829" watchObservedRunningTime="2026-04-21 17:36:22.000777757 +0000 UTC m=+169.140840524" Apr 21 17:36:26.454675 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.454630 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp"] Apr 21 17:36:26.459394 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.459370 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.461770 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.461743 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 17:36:26.461886 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.461854 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 17:36:26.462053 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.462022 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-fwmf9\"" Apr 21 17:36:26.462810 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.462783 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 17:36:26.462896 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.462832 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 17:36:26.462896 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.462870 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 17:36:26.471573 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.470605 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ns9q7"] Apr 21 17:36:26.473838 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.473820 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp"] Apr 21 17:36:26.473957 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.473945 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.476844 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.476826 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xm58w\"" Apr 21 17:36:26.477142 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.476826 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 17:36:26.477142 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.476990 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 17:36:26.477142 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.476932 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 17:36:26.643243 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643210 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-wtmp\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.643243 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643255 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-textfile\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.643481 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643272 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-accelerators-collector-config\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.643481 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643295 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.643481 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643316 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-root\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.643481 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643351 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9gvg\" (UniqueName: \"kubernetes.io/projected/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-kube-api-access-x9gvg\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.643481 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643384 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.643481 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643437 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-sys\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.643683 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643482 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-tls\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.643683 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643502 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-metrics-client-ca\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.643683 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643531 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.643683 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643554 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.643683 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.643570 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs78b\" (UniqueName: \"kubernetes.io/projected/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-kube-api-access-hs78b\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.744462 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744421 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.744462 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744460 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-sys\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.744672 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744488 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-tls\") pod 
\"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.744672 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744566 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-sys\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.744672 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744609 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-metrics-client-ca\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.744672 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744650 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.744848 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744689 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.744848 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744717 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs78b\" (UniqueName: 
\"kubernetes.io/projected/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-kube-api-access-hs78b\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.744950 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744892 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-wtmp\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.745008 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.744973 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-textfile\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.745096 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.745075 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-wtmp\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.745161 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.745004 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-accelerators-collector-config\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.745161 ip-10-0-129-92 kubenswrapper[2583]: I0421 
17:36:26.745137 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.745287 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.745166 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-root\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.745287 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.745222 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9gvg\" (UniqueName: \"kubernetes.io/projected/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-kube-api-access-x9gvg\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.745385 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.745364 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-textfile\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.745385 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.745375 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-metrics-client-ca\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" 
Apr 21 17:36:26.745505 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.745434 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-root\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.745505 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.745447 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.746015 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.745991 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-accelerators-collector-config\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.747672 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.747639 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-tls\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.747777 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.747729 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: 
\"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.748084 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.748054 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.748084 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.748082 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.757690 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.757662 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9gvg\" (UniqueName: \"kubernetes.io/projected/edb54a22-e9b8-49a2-b5f3-e4273f5b4400-kube-api-access-x9gvg\") pod \"node-exporter-ns9q7\" (UID: \"edb54a22-e9b8-49a2-b5f3-e4273f5b4400\") " pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.758281 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.758261 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs78b\" (UniqueName: \"kubernetes.io/projected/b27107b2-5afd-4ffb-9d8b-7d03e1202c52-kube-api-access-hs78b\") pod \"openshift-state-metrics-9d44df66c-hmmfp\" (UID: \"b27107b2-5afd-4ffb-9d8b-7d03e1202c52\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.770099 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.770080 2583 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" Apr 21 17:36:26.784011 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.783986 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ns9q7" Apr 21 17:36:26.793221 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:26.793192 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb54a22_e9b8_49a2_b5f3_e4273f5b4400.slice/crio-afb613198e05169d18efb67fb7cc98aab53a4c0cc861ad36893928a45451ac0c WatchSource:0}: Error finding container afb613198e05169d18efb67fb7cc98aab53a4c0cc861ad36893928a45451ac0c: Status 404 returned error can't find the container with id afb613198e05169d18efb67fb7cc98aab53a4c0cc861ad36893928a45451ac0c Apr 21 17:36:26.899658 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.899610 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp"] Apr 21 17:36:26.904273 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:26.904243 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb27107b2_5afd_4ffb_9d8b_7d03e1202c52.slice/crio-96fe3032c0fefdfa95724fa79a080c4ea0993e5dc8c29f51df727886996961d6 WatchSource:0}: Error finding container 96fe3032c0fefdfa95724fa79a080c4ea0993e5dc8c29f51df727886996961d6: Status 404 returned error can't find the container with id 96fe3032c0fefdfa95724fa79a080c4ea0993e5dc8c29f51df727886996961d6 Apr 21 17:36:26.973075 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.973047 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" event={"ID":"b27107b2-5afd-4ffb-9d8b-7d03e1202c52","Type":"ContainerStarted","Data":"c89fe6b32ba4ec467edbcb9f34b61c0253a70ad697a677c416b16fda69168c1c"} Apr 21 
17:36:26.973210 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.973086 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" event={"ID":"b27107b2-5afd-4ffb-9d8b-7d03e1202c52","Type":"ContainerStarted","Data":"96fe3032c0fefdfa95724fa79a080c4ea0993e5dc8c29f51df727886996961d6"} Apr 21 17:36:26.974552 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:26.974522 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ns9q7" event={"ID":"edb54a22-e9b8-49a2-b5f3-e4273f5b4400","Type":"ContainerStarted","Data":"afb613198e05169d18efb67fb7cc98aab53a4c0cc861ad36893928a45451ac0c"} Apr 21 17:36:27.505037 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.505007 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 17:36:27.510323 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.510305 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 17:36:27.512885 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.512855 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 17:36:27.512885 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.512878 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 17:36:27.513044 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.512917 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 17:36:27.513044 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.512862 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 17:36:27.513044 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:36:27.512855 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 17:36:27.513356 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.513340 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 17:36:27.513453 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.513357 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 17:36:27.513453 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.513368 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 17:36:27.513453 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.513378 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-krwn2\"" Apr 21 17:36:27.513453 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.513341 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 17:36:27.524317 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.524294 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 17:36:27.653787 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.653757 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 17:36:27.653920 ip-10-0-129-92 kubenswrapper[2583]: I0421 
17:36:27.653810 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-out\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.653920 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.653844 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.653920 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.653906 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.654089 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.653935 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.654089 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.653997 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-web-config\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.654089 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.654023 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-678hd\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-kube-api-access-678hd\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.654089 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.654050 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.654089 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.654075 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.654313 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.654103 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.654313 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.654138 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.654313 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.654230 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.654313 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.654266 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-volume\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.755066 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.754968 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.755066 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.755050 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.755299 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:36:27.755103 2583 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 21 17:36:27.755299 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:36:27.755204 2583 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-main-tls podName:340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f nodeName:}" failed. No retries permitted until 2026-04-21 17:36:28.255157645 +0000 UTC m=+175.395220395 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f") : secret "alertmanager-main-tls" not found
Apr 21 17:36:27.755578 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.755082 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-volume\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.755777 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.755756 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.755929 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.755913 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-out\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.756045 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.756030 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.756189 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.756152 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.756310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.756294 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.756452 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.756438 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-web-config\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.756551 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.756538 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-678hd\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-kube-api-access-678hd\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.756647 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.756635 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.756743 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.756729 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.756847 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.756832 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.757822 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.757489 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.758390 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.758274 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.758653 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.758626 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.758951 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.758633 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.759020 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.758949 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.759083 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.759021 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-volume\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.759549 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.759528 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.760524 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.760495 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.760964 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.760940 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-web-config\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.761069 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.761048 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-out\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.761518 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.761499 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.765587 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.765565 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-678hd\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-kube-api-access-678hd\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:27.979146 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.979107 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" event={"ID":"b27107b2-5afd-4ffb-9d8b-7d03e1202c52","Type":"ContainerStarted","Data":"32b205141abe66b437d145ae61440857217639fc1ce85ea2359d5a79a7ea857f"}
Apr 21 17:36:27.980433 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.980401 2583 generic.go:358] "Generic (PLEG): container finished" podID="edb54a22-e9b8-49a2-b5f3-e4273f5b4400" containerID="9ad0f52bafd0f43b8a8d567553b837e19a5ca9f94d6793f717f9381989f710a7" exitCode=0
Apr 21 17:36:27.980558 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:27.980484 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ns9q7" event={"ID":"edb54a22-e9b8-49a2-b5f3-e4273f5b4400","Type":"ContainerDied","Data":"9ad0f52bafd0f43b8a8d567553b837e19a5ca9f94d6793f717f9381989f710a7"}
Apr 21 17:36:28.263867 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:28.263781 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:28.266690 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:28.266666 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:28.421412 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:28.421366 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:36:28.579511 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:28.579419 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 17:36:28.584044 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:28.584018 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod340cdb7d_6dfd_4ee4_b7f4_ab15731a6c7f.slice/crio-fff0690568355649cfb88b0f2639bd75c648c76d85697b064108c84277620edc WatchSource:0}: Error finding container fff0690568355649cfb88b0f2639bd75c648c76d85697b064108c84277620edc: Status 404 returned error can't find the container with id fff0690568355649cfb88b0f2639bd75c648c76d85697b064108c84277620edc
Apr 21 17:36:28.985485 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:28.985439 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" event={"ID":"b27107b2-5afd-4ffb-9d8b-7d03e1202c52","Type":"ContainerStarted","Data":"64b70b413586eac13b93763c47b145c298e957b4869cf990931ead20b2c3a2df"}
Apr 21 17:36:28.987328 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:28.987299 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ns9q7" event={"ID":"edb54a22-e9b8-49a2-b5f3-e4273f5b4400","Type":"ContainerStarted","Data":"9f5be420ff410a0b22be3095e3eaea1c9e18ec5de9ff551f6ef8a4c7ac3119e9"}
Apr 21 17:36:28.987328 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:28.987331 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ns9q7" event={"ID":"edb54a22-e9b8-49a2-b5f3-e4273f5b4400","Type":"ContainerStarted","Data":"03c8880c507ac32cc166d735932a9f2a2807d66314ea93926bfc5445210e32ab"}
Apr 21 17:36:28.988292 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:28.988268 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerStarted","Data":"fff0690568355649cfb88b0f2639bd75c648c76d85697b064108c84277620edc"}
Apr 21 17:36:29.002887 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:29.002844 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-hmmfp" podStartSLOduration=1.687562277 podStartE2EDuration="3.002833779s" podCreationTimestamp="2026-04-21 17:36:26 +0000 UTC" firstStartedPulling="2026-04-21 17:36:27.034533677 +0000 UTC m=+174.174596422" lastFinishedPulling="2026-04-21 17:36:28.34980517 +0000 UTC m=+175.489867924" observedRunningTime="2026-04-21 17:36:29.001884353 +0000 UTC m=+176.141947120" watchObservedRunningTime="2026-04-21 17:36:29.002833779 +0000 UTC m=+176.142896610"
Apr 21 17:36:29.020222 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:29.020132 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ns9q7" podStartSLOduration=2.358181791 podStartE2EDuration="3.020119198s" podCreationTimestamp="2026-04-21 17:36:26 +0000 UTC" firstStartedPulling="2026-04-21 17:36:26.795394136 +0000 UTC m=+173.935456884" lastFinishedPulling="2026-04-21 17:36:27.45733154 +0000 UTC m=+174.597394291" observedRunningTime="2026-04-21 17:36:29.019230335 +0000 UTC m=+176.159293102" watchObservedRunningTime="2026-04-21 17:36:29.020119198 +0000 UTC m=+176.160182001"
Apr 21 17:36:29.992725 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:29.992689 2583 generic.go:358] "Generic (PLEG): container finished" podID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerID="46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a" exitCode=0
Apr 21 17:36:29.993148 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:29.992775 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerDied","Data":"46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a"}
Apr 21 17:36:30.780029 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.779985 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-58f9c865df-m8vrm"]
Apr 21 17:36:30.783330 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.783302 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.785964 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.785938 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 21 17:36:30.787008 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.786895 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ekj9udjnb4g98\""
Apr 21 17:36:30.787008 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.786928 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 21 17:36:30.787008 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.786976 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-2czws\""
Apr 21 17:36:30.787008 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.786995 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 21 17:36:30.787306 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.786928 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 21 17:36:30.787306 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.787112 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d73ac7ba-d87d-428c-a3df-8217c7ce9412-secret-metrics-server-client-certs\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.787306 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.787149 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d73ac7ba-d87d-428c-a3df-8217c7ce9412-secret-metrics-server-tls\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.787306 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.787199 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73ac7ba-d87d-428c-a3df-8217c7ce9412-client-ca-bundle\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.787306 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.787245 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d73ac7ba-d87d-428c-a3df-8217c7ce9412-metrics-server-audit-profiles\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.787306 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.787271 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d73ac7ba-d87d-428c-a3df-8217c7ce9412-audit-log\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.787600 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.787347 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cf77\" (UniqueName: \"kubernetes.io/projected/d73ac7ba-d87d-428c-a3df-8217c7ce9412-kube-api-access-9cf77\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.787600 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.787419 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d73ac7ba-d87d-428c-a3df-8217c7ce9412-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.792563 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.792537 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-58f9c865df-m8vrm"]
Apr 21 17:36:30.888254 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.888212 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d73ac7ba-d87d-428c-a3df-8217c7ce9412-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.888440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.888285 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d73ac7ba-d87d-428c-a3df-8217c7ce9412-secret-metrics-server-client-certs\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.888440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.888313 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d73ac7ba-d87d-428c-a3df-8217c7ce9412-secret-metrics-server-tls\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.888440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.888330 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73ac7ba-d87d-428c-a3df-8217c7ce9412-client-ca-bundle\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.888440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.888387 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d73ac7ba-d87d-428c-a3df-8217c7ce9412-metrics-server-audit-profiles\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.888440 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.888422 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d73ac7ba-d87d-428c-a3df-8217c7ce9412-audit-log\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.888705 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.888469 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cf77\" (UniqueName: \"kubernetes.io/projected/d73ac7ba-d87d-428c-a3df-8217c7ce9412-kube-api-access-9cf77\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.889058 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.889033 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d73ac7ba-d87d-428c-a3df-8217c7ce9412-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.889058 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.889048 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d73ac7ba-d87d-428c-a3df-8217c7ce9412-audit-log\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.890238 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.890213 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d73ac7ba-d87d-428c-a3df-8217c7ce9412-metrics-server-audit-profiles\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.891298 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.891278 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d73ac7ba-d87d-428c-a3df-8217c7ce9412-secret-metrics-server-client-certs\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.891484 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.891462 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d73ac7ba-d87d-428c-a3df-8217c7ce9412-secret-metrics-server-tls\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.891637 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.891612 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d73ac7ba-d87d-428c-a3df-8217c7ce9412-client-ca-bundle\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:30.903067 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:30.903035 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cf77\" (UniqueName: \"kubernetes.io/projected/d73ac7ba-d87d-428c-a3df-8217c7ce9412-kube-api-access-9cf77\") pod \"metrics-server-58f9c865df-m8vrm\" (UID: \"d73ac7ba-d87d-428c-a3df-8217c7ce9412\") " pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:31.095822 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.095731 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm"
Apr 21 17:36:31.487524 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.487494 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-58f9c865df-m8vrm"]
Apr 21 17:36:31.490551 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:36:31.490515 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd73ac7ba_d87d_428c_a3df_8217c7ce9412.slice/crio-50758e02acf04bbda8d0a9090d5cce14547c29e11b436cd7b9fbd3a99a6318ff WatchSource:0}: Error finding container 50758e02acf04bbda8d0a9090d5cce14547c29e11b436cd7b9fbd3a99a6318ff: Status 404 returned error can't find the container with id 50758e02acf04bbda8d0a9090d5cce14547c29e11b436cd7b9fbd3a99a6318ff
Apr 21 17:36:31.496239 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.495581 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rf84h"]
Apr 21 17:36:31.504810 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.504780 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rf84h"
Apr 21 17:36:31.506099 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.506038 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rf84h"]
Apr 21 17:36:31.508356 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.508335 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 17:36:31.508793 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.508385 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4lbxp\""
Apr 21 17:36:31.508793 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.508427 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 17:36:31.595475 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.595421 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7v6z\" (UniqueName: \"kubernetes.io/projected/b381afa1-d1a3-4590-97dd-05ddf6b9551c-kube-api-access-p7v6z\") pod \"downloads-6bcc868b7-rf84h\" (UID: \"b381afa1-d1a3-4590-97dd-05ddf6b9551c\") " pod="openshift-console/downloads-6bcc868b7-rf84h"
Apr 21 17:36:31.696047 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.696016 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7v6z\" (UniqueName: \"kubernetes.io/projected/b381afa1-d1a3-4590-97dd-05ddf6b9551c-kube-api-access-p7v6z\") pod \"downloads-6bcc868b7-rf84h\" (UID: \"b381afa1-d1a3-4590-97dd-05ddf6b9551c\") " pod="openshift-console/downloads-6bcc868b7-rf84h"
Apr 21 17:36:31.705202 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.705157 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7v6z\" (UniqueName: \"kubernetes.io/projected/b381afa1-d1a3-4590-97dd-05ddf6b9551c-kube-api-access-p7v6z\") pod \"downloads-6bcc868b7-rf84h\" (UID: \"b381afa1-d1a3-4590-97dd-05ddf6b9551c\") " pod="openshift-console/downloads-6bcc868b7-rf84h"
Apr 21 17:36:31.818713 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.818677 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rf84h"
Apr 21 17:36:31.940033 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.940007 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rf84h"]
Apr 21 17:36:31.962993 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:31.962970 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tk5wc"
Apr 21 17:36:32.000918 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.000883 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rf84h" event={"ID":"b381afa1-d1a3-4590-97dd-05ddf6b9551c","Type":"ContainerStarted","Data":"217a79d78d2e61d889dd9afcd532c1bc2385dccdcfc399c3a076107d0db98706"}
Apr 21 17:36:32.002198 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.002147 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm" event={"ID":"d73ac7ba-d87d-428c-a3df-8217c7ce9412","Type":"ContainerStarted","Data":"50758e02acf04bbda8d0a9090d5cce14547c29e11b436cd7b9fbd3a99a6318ff"}
Apr 21 17:36:32.005134 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.005111 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerStarted","Data":"c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d"}
Apr 21 17:36:32.005258 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.005138 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerStarted","Data":"81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec"}
Apr 21 17:36:32.005258 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.005147 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerStarted","Data":"5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7"}
Apr 21 17:36:32.005258 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.005157 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerStarted","Data":"f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa"}
Apr 21 17:36:32.005258 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.005166 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerStarted","Data":"1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb"}
Apr 21 17:36:32.664256 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.664225 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 17:36:32.668498 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.668473 2583 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.671168 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.671070 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 17:36:32.671319 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.671232 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8b9mh\"" Apr 21 17:36:32.671319 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.671232 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 17:36:32.671415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.671332 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 17:36:32.671621 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.671589 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 17:36:32.671699 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.671647 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 17:36:32.671874 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.671857 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 17:36:32.672486 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.672435 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 17:36:32.673236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.672898 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 17:36:32.673236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.672988 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 17:36:32.673236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.673151 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 17:36:32.674096 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.674044 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5mthhaug7i2fa\"" Apr 21 17:36:32.675128 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.675109 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 17:36:32.679649 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.679628 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 17:36:32.681004 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.680983 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 17:36:32.706288 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706249 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706463 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706306 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706463 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706351 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706463 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706379 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706642 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706604 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706700 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706656 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706700 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706689 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706800 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706716 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706800 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706748 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706800 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706777 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706937 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706814 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-config\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.706937 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706835 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-config-out\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.707036 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.706950 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.707036 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.707022 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.707109 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.707057 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-web-config\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.707109 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.707084 2583 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.707197 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.707122 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n9cd\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-kube-api-access-7n9cd\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.707197 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.707147 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.808556 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.808510 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.808740 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.808567 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.808740 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.808606 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-config\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.808740 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.808651 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-config-out\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.808910 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.808830 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.808910 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.808874 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.808910 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.808904 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809054 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.808930 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809054 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.808972 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n9cd\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-kube-api-access-7n9cd\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809054 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809002 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809054 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809033 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809280 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809065 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809280 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809106 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809280 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809130 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809280 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809228 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809280 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809268 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809526 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:36:32.809296 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809526 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809322 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809526 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809463 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.809740 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.809720 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.811029 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.810722 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
17:36:32.811029 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.810781 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.811899 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.811867 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.811993 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.811917 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.812562 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.812215 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-config-out\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.813847 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.813819 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.814077 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.814051 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-config\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.814286 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.814245 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.816133 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.816111 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.816720 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.816698 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.817325 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.817283 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-web-config\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
17:36:32.818110 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.818069 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.818235 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.818218 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.818298 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.818246 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.819001 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.818981 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.823514 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.823492 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n9cd\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-kube-api-access-7n9cd\") pod \"prometheus-k8s-0\" (UID: 
\"ba310721-5ccc-4564-b8bd-54ae583077ea\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:32.981559 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:32.981527 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:33.348634 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:33.348597 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 17:36:34.014075 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:34.014038 2583 generic.go:358] "Generic (PLEG): container finished" podID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerID="ffbd01a6351c7fee0461bfa19d252553902871557907817296315582873fa808" exitCode=0 Apr 21 17:36:34.014571 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:34.014130 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerDied","Data":"ffbd01a6351c7fee0461bfa19d252553902871557907817296315582873fa808"} Apr 21 17:36:34.014571 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:34.014155 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerStarted","Data":"c5b1e78dafc5eeb5112e66e795961eb4ae84ed970c8a0c0d6e8afc8d00b91892"} Apr 21 17:36:34.018096 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:34.018064 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerStarted","Data":"682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407"} Apr 21 17:36:34.020040 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:34.019852 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm" 
event={"ID":"d73ac7ba-d87d-428c-a3df-8217c7ce9412","Type":"ContainerStarted","Data":"c1d0ebe0dbbe24ef66f1144f786baf63a0547748b7670d60aa911c8c46486e85"} Apr 21 17:36:34.058534 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:34.058473 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm" podStartSLOduration=2.344371154 podStartE2EDuration="4.058458523s" podCreationTimestamp="2026-04-21 17:36:30 +0000 UTC" firstStartedPulling="2026-04-21 17:36:31.494308357 +0000 UTC m=+178.634371119" lastFinishedPulling="2026-04-21 17:36:33.208395737 +0000 UTC m=+180.348458488" observedRunningTime="2026-04-21 17:36:34.056953261 +0000 UTC m=+181.197016041" watchObservedRunningTime="2026-04-21 17:36:34.058458523 +0000 UTC m=+181.198521289" Apr 21 17:36:34.084068 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:34.084019 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.507084815 podStartE2EDuration="7.08400593s" podCreationTimestamp="2026-04-21 17:36:27 +0000 UTC" firstStartedPulling="2026-04-21 17:36:28.586238761 +0000 UTC m=+175.726301512" lastFinishedPulling="2026-04-21 17:36:33.163159872 +0000 UTC m=+180.303222627" observedRunningTime="2026-04-21 17:36:34.082858588 +0000 UTC m=+181.222921356" watchObservedRunningTime="2026-04-21 17:36:34.08400593 +0000 UTC m=+181.224068697" Apr 21 17:36:38.037646 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:38.037605 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerStarted","Data":"558d3e5dbcdca1b1a28a58c85f743e6482622eff966e2c5406933257cd2d4aa0"} Apr 21 17:36:38.038103 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:38.037656 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerStarted","Data":"cd969509d181f4fbc0c959435b7ba3ab3bc4accd5ea11b5c454011f14b5132d3"} Apr 21 17:36:39.329715 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:39.329678 2583 patch_prober.go:28] interesting pod/image-registry-86bffdb85-f9pnq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 17:36:39.330104 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:39.329742 2583 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-86bffdb85-f9pnq" podUID="ec7cf3c5-c440-4bc6-a03d-076cc402b14a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 17:36:40.048336 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:40.048301 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerStarted","Data":"71aea83c78475ef48ee94c94d4aa38934adfa7bf7d7aa926f18d69a09b7e4921"} Apr 21 17:36:40.048336 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:40.048342 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerStarted","Data":"1d5b78226227ca502c35b362e3be78e6e5072eaf7ccce6c0ab6235205ced4fb4"} Apr 21 17:36:40.048595 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:40.048358 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerStarted","Data":"a30e2d322aa223d0d6d2b15ad9e74895e896dbb16363a669053ace6b3ee63dff"} Apr 21 17:36:40.048595 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:40.048370 2583 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerStarted","Data":"91c0dcfa0814c5b459e99aab82a04881787e7d0f6b8c9008303a1cca7042f20b"} Apr 21 17:36:40.082769 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:40.082718 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.844098512 podStartE2EDuration="8.082699924s" podCreationTimestamp="2026-04-21 17:36:32 +0000 UTC" firstStartedPulling="2026-04-21 17:36:34.015647782 +0000 UTC m=+181.155710529" lastFinishedPulling="2026-04-21 17:36:39.254249192 +0000 UTC m=+186.394311941" observedRunningTime="2026-04-21 17:36:40.079606012 +0000 UTC m=+187.219668782" watchObservedRunningTime="2026-04-21 17:36:40.082699924 +0000 UTC m=+187.222762691" Apr 21 17:36:40.956325 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:40.956291 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86bffdb85-f9pnq" Apr 21 17:36:42.982542 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:42.982498 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:36:49.080405 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:49.080357 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rf84h" event={"ID":"b381afa1-d1a3-4590-97dd-05ddf6b9551c","Type":"ContainerStarted","Data":"3df8855d8009f09442362a7c0f69d84e66d603202dcb8d74369f75ed3f9e6ab3"} Apr 21 17:36:49.080884 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:49.080598 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rf84h" Apr 21 17:36:49.092070 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:49.092042 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-console/downloads-6bcc868b7-rf84h" Apr 21 17:36:49.104222 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:49.104147 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rf84h" podStartSLOduration=1.593807978 podStartE2EDuration="18.104132887s" podCreationTimestamp="2026-04-21 17:36:31 +0000 UTC" firstStartedPulling="2026-04-21 17:36:31.946145531 +0000 UTC m=+179.086208276" lastFinishedPulling="2026-04-21 17:36:48.456470438 +0000 UTC m=+195.596533185" observedRunningTime="2026-04-21 17:36:49.102719358 +0000 UTC m=+196.242782159" watchObservedRunningTime="2026-04-21 17:36:49.104132887 +0000 UTC m=+196.244195653" Apr 21 17:36:51.096659 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:51.096619 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm" Apr 21 17:36:51.096659 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:36:51.096663 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm" Apr 21 17:37:11.101552 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:11.101464 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm" Apr 21 17:37:11.105371 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:11.105346 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-58f9c865df-m8vrm" Apr 21 17:37:17.168828 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:17.168746 2583 generic.go:358] "Generic (PLEG): container finished" podID="343fcef3-240d-459d-8c84-7164f0722f10" containerID="69c1acfa0d8ae5f48b1166c7b88c9b6b6246433dded32e3baec56a94b72e9068" exitCode=0 Apr 21 17:37:17.169225 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:17.168822 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" event={"ID":"343fcef3-240d-459d-8c84-7164f0722f10","Type":"ContainerDied","Data":"69c1acfa0d8ae5f48b1166c7b88c9b6b6246433dded32e3baec56a94b72e9068"} Apr 21 17:37:17.169225 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:17.169152 2583 scope.go:117] "RemoveContainer" containerID="69c1acfa0d8ae5f48b1166c7b88c9b6b6246433dded32e3baec56a94b72e9068" Apr 21 17:37:18.172646 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:18.172609 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-h7cx4" event={"ID":"343fcef3-240d-459d-8c84-7164f0722f10","Type":"ContainerStarted","Data":"59d99483e29bb0fa9ee431f0b68a5b3f449794167b282031d8da0db75852b29a"} Apr 21 17:37:19.176540 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:19.176502 2583 generic.go:358] "Generic (PLEG): container finished" podID="accd670b-edfb-4f84-9bb3-c72f1ca32432" containerID="836879d8dfcd28791c2dfee2be5d60e274fd6c08793c8f4616a5c2563187203c" exitCode=0 Apr 21 17:37:19.176905 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:19.176579 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" event={"ID":"accd670b-edfb-4f84-9bb3-c72f1ca32432","Type":"ContainerDied","Data":"836879d8dfcd28791c2dfee2be5d60e274fd6c08793c8f4616a5c2563187203c"} Apr 21 17:37:19.176958 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:19.176942 2583 scope.go:117] "RemoveContainer" containerID="836879d8dfcd28791c2dfee2be5d60e274fd6c08793c8f4616a5c2563187203c" Apr 21 17:37:20.180786 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:20.180743 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-m4dlz" 
event={"ID":"accd670b-edfb-4f84-9bb3-c72f1ca32432","Type":"ContainerStarted","Data":"d5f1524223e941fd3cea7e27ce2ba8102aec974d1b5a161dfad18d5aa461959a"} Apr 21 17:37:32.984477 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:32.983020 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:33.001301 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:33.001274 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:33.235471 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:33.235395 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:45.189866 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:45.189755 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:37:45.192364 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:45.192338 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/38cd15ba-d0c7-4b4f-b220-f72981ccd9da-metrics-certs\") pod \"network-metrics-daemon-rfmv6\" (UID: \"38cd15ba-d0c7-4b4f-b220-f72981ccd9da\") " pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:37:45.363688 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:45.363655 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqc26\"" Apr 21 17:37:45.371910 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:45.371882 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rfmv6" Apr 21 17:37:45.495058 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:45.495027 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rfmv6"] Apr 21 17:37:45.498134 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:37:45.498106 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38cd15ba_d0c7_4b4f_b220_f72981ccd9da.slice/crio-f4217ac6d9e84e25a90c74ce73a8a4b9a54671982a4296edd2c47a40400ed51f WatchSource:0}: Error finding container f4217ac6d9e84e25a90c74ce73a8a4b9a54671982a4296edd2c47a40400ed51f: Status 404 returned error can't find the container with id f4217ac6d9e84e25a90c74ce73a8a4b9a54671982a4296edd2c47a40400ed51f Apr 21 17:37:46.262931 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:46.262892 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rfmv6" event={"ID":"38cd15ba-d0c7-4b4f-b220-f72981ccd9da","Type":"ContainerStarted","Data":"f4217ac6d9e84e25a90c74ce73a8a4b9a54671982a4296edd2c47a40400ed51f"} Apr 21 17:37:46.716632 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:46.716594 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 17:37:46.717126 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:46.717080 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="alertmanager" containerID="cri-o://1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb" gracePeriod=120 Apr 21 17:37:46.717212 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:46.717136 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" 
containerName="kube-rbac-proxy-metric" containerID="cri-o://c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d" gracePeriod=120 Apr 21 17:37:46.717273 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:46.717201 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="prom-label-proxy" containerID="cri-o://682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407" gracePeriod=120 Apr 21 17:37:46.717273 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:46.717223 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy-web" containerID="cri-o://5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7" gracePeriod=120 Apr 21 17:37:46.717374 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:46.717267 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy" containerID="cri-o://81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec" gracePeriod=120 Apr 21 17:37:46.717374 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:46.717277 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="config-reloader" containerID="cri-o://f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa" gracePeriod=120 Apr 21 17:37:47.269585 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.269553 2583 generic.go:358] "Generic (PLEG): container finished" podID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerID="682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407" exitCode=0 Apr 21 17:37:47.269585 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:37:47.269579 2583 generic.go:358] "Generic (PLEG): container finished" podID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerID="c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d" exitCode=0 Apr 21 17:37:47.269585 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.269586 2583 generic.go:358] "Generic (PLEG): container finished" podID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerID="81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec" exitCode=0 Apr 21 17:37:47.269585 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.269593 2583 generic.go:358] "Generic (PLEG): container finished" podID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerID="f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa" exitCode=0 Apr 21 17:37:47.269585 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.269599 2583 generic.go:358] "Generic (PLEG): container finished" podID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerID="1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb" exitCode=0 Apr 21 17:37:47.270218 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.269630 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerDied","Data":"682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407"} Apr 21 17:37:47.270218 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.269665 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerDied","Data":"c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d"} Apr 21 17:37:47.270218 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.269675 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerDied","Data":"81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec"} Apr 21 17:37:47.270218 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.269684 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerDied","Data":"f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa"} Apr 21 17:37:47.270218 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.269694 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerDied","Data":"1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb"} Apr 21 17:37:47.271332 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.271309 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rfmv6" event={"ID":"38cd15ba-d0c7-4b4f-b220-f72981ccd9da","Type":"ContainerStarted","Data":"ed5322c9e0e548cd42e6b71bf402472cf494e27deb7cb47d49b5fcbf65672fa5"} Apr 21 17:37:47.271456 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.271337 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rfmv6" event={"ID":"38cd15ba-d0c7-4b4f-b220-f72981ccd9da","Type":"ContainerStarted","Data":"f41c7aae241e815f5a85bd49cf49b05af55c945117fc1715a6caa76621f5d94b"} Apr 21 17:37:47.288838 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.288793 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rfmv6" podStartSLOduration=253.408305105 podStartE2EDuration="4m14.288780793s" podCreationTimestamp="2026-04-21 17:33:33 +0000 UTC" firstStartedPulling="2026-04-21 17:37:45.499918836 +0000 UTC m=+252.639981582" lastFinishedPulling="2026-04-21 17:37:46.380394525 +0000 UTC m=+253.520457270" 
observedRunningTime="2026-04-21 17:37:47.287376772 +0000 UTC m=+254.427439575" watchObservedRunningTime="2026-04-21 17:37:47.288780793 +0000 UTC m=+254.428843559" Apr 21 17:37:47.953602 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:47.953578 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 17:37:48.011474 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011429 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-cluster-tls-config\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.011474 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011478 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-volume\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.011718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011506 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-678hd\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-kube-api-access-678hd\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.011718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011532 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.011718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011565 2583 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-tls-assets\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.011718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011592 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-main-db\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.011718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011617 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-web\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.011718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011658 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-main-tls\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.011718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011700 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-metrics-client-ca\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.012124 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011725 2583 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.012124 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011754 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-trusted-ca-bundle\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.012124 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011780 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-out\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.012124 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.011800 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-web-config\") pod \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\" (UID: \"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f\") " Apr 21 17:37:48.013621 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.013271 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:48.013886 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.013859 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:48.014309 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.014266 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:37:48.016896 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.016844 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:48.017019 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.016907 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-kube-api-access-678hd" (OuterVolumeSpecName: "kube-api-access-678hd") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "kube-api-access-678hd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:37:48.018467 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.018356 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-volume" (OuterVolumeSpecName: "config-volume") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:48.018467 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.018443 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:48.018634 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.018489 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:48.018963 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.018812 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:37:48.018963 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.018919 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:48.020426 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.020379 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-out" (OuterVolumeSpecName: "config-out") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:37:48.021601 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.021580 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:48.029743 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.029720 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-web-config" (OuterVolumeSpecName: "web-config") pod "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" (UID: "340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 17:37:48.112680 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112563 2583 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-metrics-client-ca\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.112680 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112611 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.112680 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112628 2583 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.112680 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112641 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-out\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.112680 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112655 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-web-config\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.112680 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112668 2583 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-cluster-tls-config\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.112680 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112682 2583 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-config-volume\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.112680 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112694 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-678hd\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-kube-api-access-678hd\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.113038 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112707 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.113038 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112719 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-tls-assets\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.113038 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112730 2583 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-alertmanager-main-db\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.113038 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112742 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.113038 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.112754 2583 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f-secret-alertmanager-main-tls\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:37:48.277901 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.277864 2583 generic.go:358] "Generic (PLEG): container finished" podID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerID="5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7" exitCode=0
Apr 21 17:37:48.278392 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.277950 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerDied","Data":"5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7"}
Apr 21 17:37:48.278392 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.277997 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f","Type":"ContainerDied","Data":"fff0690568355649cfb88b0f2639bd75c648c76d85697b064108c84277620edc"}
Apr 21 17:37:48.278392 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.278001 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.278392 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.278018 2583 scope.go:117] "RemoveContainer" containerID="682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407"
Apr 21 17:37:48.285741 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.285711 2583 scope.go:117] "RemoveContainer" containerID="c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d"
Apr 21 17:37:48.292794 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.292777 2583 scope.go:117] "RemoveContainer" containerID="81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec"
Apr 21 17:37:48.299583 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.299566 2583 scope.go:117] "RemoveContainer" containerID="5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7"
Apr 21 17:37:48.305824 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.305807 2583 scope.go:117] "RemoveContainer" containerID="f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa"
Apr 21 17:37:48.312079 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.312050 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 17:37:48.313835 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.313812 2583 scope.go:117] "RemoveContainer" containerID="1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb"
Apr 21 17:37:48.321423 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.321376 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 17:37:48.324539 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.324523 2583 scope.go:117] "RemoveContainer" containerID="46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a"
Apr 21 17:37:48.331114 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.331099 2583 scope.go:117] "RemoveContainer" containerID="682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407"
Apr 21 17:37:48.331378 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:37:48.331360 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407\": container with ID starting with 682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407 not found: ID does not exist" containerID="682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407"
Apr 21 17:37:48.331435 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.331392 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407"} err="failed to get container status \"682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407\": rpc error: code = NotFound desc = could not find container \"682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407\": container with ID starting with 682bc1a335241a84b7700a023974469fe4657c56b1e6673d5f26f757b87da407 not found: ID does not exist"
Apr 21 17:37:48.331435 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.331427 2583 scope.go:117] "RemoveContainer" containerID="c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d"
Apr 21 17:37:48.331625 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:37:48.331608 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d\": container with ID starting with c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d not found: ID does not exist" containerID="c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d"
Apr 21 17:37:48.331667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.331632 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d"} err="failed to get container status \"c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d\": rpc error: code = NotFound desc = could not find container \"c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d\": container with ID starting with c11fb6335b74eff289093123c54af78b221ce463186f5671cb0642637b7cf43d not found: ID does not exist"
Apr 21 17:37:48.331667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.331649 2583 scope.go:117] "RemoveContainer" containerID="81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec"
Apr 21 17:37:48.331875 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:37:48.331860 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec\": container with ID starting with 81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec not found: ID does not exist" containerID="81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec"
Apr 21 17:37:48.331936 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.331878 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec"} err="failed to get container status \"81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec\": rpc error: code = NotFound desc = could not find container \"81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec\": container with ID starting with 81a268ca40bf5b2e6fb96811914856ad7b213c871a60253b1443c6eb4ddd8fec not found: ID does not exist"
Apr 21 17:37:48.331936 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.331890 2583 scope.go:117] "RemoveContainer" containerID="5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7"
Apr 21 17:37:48.332090 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:37:48.332072 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7\": container with ID starting with 5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7 not found: ID does not exist" containerID="5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7"
Apr 21 17:37:48.332127 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.332094 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7"} err="failed to get container status \"5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7\": rpc error: code = NotFound desc = could not find container \"5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7\": container with ID starting with 5e2f0ca86127f7a363e5c40ec139360645218f737f980d3d7f6de6348492d0e7 not found: ID does not exist"
Apr 21 17:37:48.332127 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.332107 2583 scope.go:117] "RemoveContainer" containerID="f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa"
Apr 21 17:37:48.332361 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:37:48.332342 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa\": container with ID starting with f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa not found: ID does not exist" containerID="f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa"
Apr 21 17:37:48.332410 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.332365 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa"} err="failed to get container status \"f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa\": rpc error: code = NotFound desc = could not find container \"f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa\": container with ID starting with f6ce093b35c635938fd6866427a4639f98bef936f507ec5d37a62da51173e2fa not found: ID does not exist"
Apr 21 17:37:48.332410 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.332377 2583 scope.go:117] "RemoveContainer" containerID="1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb"
Apr 21 17:37:48.332598 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:37:48.332582 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb\": container with ID starting with 1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb not found: ID does not exist" containerID="1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb"
Apr 21 17:37:48.332639 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.332604 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb"} err="failed to get container status \"1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb\": rpc error: code = NotFound desc = could not find container \"1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb\": container with ID starting with 1ee496ab60072880e548e013a10188c21cad6dee953ae32b018f6a1f01b991fb not found: ID does not exist"
Apr 21 17:37:48.332639 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.332617 2583 scope.go:117] "RemoveContainer" containerID="46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a"
Apr 21 17:37:48.332839 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:37:48.332824 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a\": container with ID starting with 46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a not found: ID does not exist" containerID="46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a"
Apr 21 17:37:48.332887 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.332842 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a"} err="failed to get container status \"46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a\": rpc error: code = NotFound desc = could not find container \"46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a\": container with ID starting with 46f02efe8d2caaee323c3a7117e49f668372fc859428e9bb130e322888c5bc3a not found: ID does not exist"
Apr 21 17:37:48.361850 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.361820 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 17:37:48.362113 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362101 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy-metric"
Apr 21 17:37:48.362155 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362126 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy-metric"
Apr 21 17:37:48.362155 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362139 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="alertmanager"
Apr 21 17:37:48.362155 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362144 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="alertmanager"
Apr 21 17:37:48.362155 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362154 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="init-config-reloader"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362160 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="init-config-reloader"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362168 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="config-reloader"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362189 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="config-reloader"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362196 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362201 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362209 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy-web"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362214 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy-web"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362221 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="prom-label-proxy"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362226 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="prom-label-proxy"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362272 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="alertmanager"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362279 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362286 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy-metric"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362293 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="prom-label-proxy"
Apr 21 17:37:48.362299 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362300 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="config-reloader"
Apr 21 17:37:48.362661 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.362307 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" containerName="kube-rbac-proxy-web"
Apr 21 17:37:48.365658 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.365615 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.368455 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.368433 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 21 17:37:48.368548 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.368433 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 21 17:37:48.368667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.368633 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 21 17:37:48.368667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.368661 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 21 17:37:48.368829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.368701 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 21 17:37:48.368829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.368712 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 21 17:37:48.368829 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.368713 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-krwn2\""
Apr 21 17:37:48.368985 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.368970 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 21 17:37:48.369046 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.369033 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 21 17:37:48.379882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.376303 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 21 17:37:48.384151 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.384127 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 17:37:48.415389 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415354 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415411 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e4f8e89a-b68a-4807-9aca-4424ccedd246-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415446 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4f8e89a-b68a-4807-9aca-4424ccedd246-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415466 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4f8e89a-b68a-4807-9aca-4424ccedd246-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415704 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415538 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmm7l\" (UniqueName: \"kubernetes.io/projected/e4f8e89a-b68a-4807-9aca-4424ccedd246-kube-api-access-zmm7l\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415704 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415624 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4f8e89a-b68a-4807-9aca-4424ccedd246-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415704 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415657 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415704 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415688 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4f8e89a-b68a-4807-9aca-4424ccedd246-config-out\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415844 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415712 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415844 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415758 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415844 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415797 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-web-config\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415844 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415818 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-config-volume\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.415844 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.415833 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.516684 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.516645 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.516865 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.516699 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e4f8e89a-b68a-4807-9aca-4424ccedd246-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.516865 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.516736 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4f8e89a-b68a-4807-9aca-4424ccedd246-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.516865 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.516761 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4f8e89a-b68a-4807-9aca-4424ccedd246-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.516865 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.516792 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmm7l\" (UniqueName: \"kubernetes.io/projected/e4f8e89a-b68a-4807-9aca-4424ccedd246-kube-api-access-zmm7l\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.516865 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.516844 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4f8e89a-b68a-4807-9aca-4424ccedd246-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.517111 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.516873 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.517111 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.516902 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4f8e89a-b68a-4807-9aca-4424ccedd246-config-out\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.517111 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.516923 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.517288 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.517256 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e4f8e89a-b68a-4807-9aca-4424ccedd246-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.517396 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.517380 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.517469 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.517425 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-web-config\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.517522 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.517471 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-config-volume\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.517522 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.517497 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.517715 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.517691 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4f8e89a-b68a-4807-9aca-4424ccedd246-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.518398 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.518370 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4f8e89a-b68a-4807-9aca-4424ccedd246-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.519999 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.519970 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.520120 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.519975 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.520285 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.520238 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.520531 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.520508 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4f8e89a-b68a-4807-9aca-4424ccedd246-config-out\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.520604 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.520516 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.520853 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.520834 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-web-config\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.520973 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.520952 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4f8e89a-b68a-4807-9aca-4424ccedd246-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.521222 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.521206 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.522089 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.522073 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e4f8e89a-b68a-4807-9aca-4424ccedd246-config-volume\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.525562 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.525541 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmm7l\" (UniqueName: \"kubernetes.io/projected/e4f8e89a-b68a-4807-9aca-4424ccedd246-kube-api-access-zmm7l\") pod \"alertmanager-main-0\" (UID: \"e4f8e89a-b68a-4807-9aca-4424ccedd246\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 21 17:37:48.675658 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.675568 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 17:37:48.803469 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:48.803446 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 17:37:48.806109 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:37:48.806081 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4f8e89a_b68a_4807_9aca_4424ccedd246.slice/crio-81d8ccb008cdd8458ed80a6304015c19bc7ae123fd84f344b27c4c90821f5c1e WatchSource:0}: Error finding container 81d8ccb008cdd8458ed80a6304015c19bc7ae123fd84f344b27c4c90821f5c1e: Status 404 returned error can't find the container with id 81d8ccb008cdd8458ed80a6304015c19bc7ae123fd84f344b27c4c90821f5c1e Apr 21 17:37:49.283337 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:49.283294 2583 generic.go:358] "Generic (PLEG): container finished" podID="e4f8e89a-b68a-4807-9aca-4424ccedd246" containerID="56c77678d61741496957ab4b0eead943daff31889cb91fe24d5d2c031c52bb8e" exitCode=0 Apr 21 17:37:49.283719 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:49.283383 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4f8e89a-b68a-4807-9aca-4424ccedd246","Type":"ContainerDied","Data":"56c77678d61741496957ab4b0eead943daff31889cb91fe24d5d2c031c52bb8e"} Apr 21 17:37:49.283719 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:49.283418 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4f8e89a-b68a-4807-9aca-4424ccedd246","Type":"ContainerStarted","Data":"81d8ccb008cdd8458ed80a6304015c19bc7ae123fd84f344b27c4c90821f5c1e"} Apr 21 17:37:49.465069 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:49.465040 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f" 
path="/var/lib/kubelet/pods/340cdb7d-6dfd-4ee4-b7f4-ab15731a6c7f/volumes" Apr 21 17:37:50.294931 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:50.294890 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4f8e89a-b68a-4807-9aca-4424ccedd246","Type":"ContainerStarted","Data":"705e4215c16d2c3fcb0f9ea7682935d0b568ebc68e8babe92bd5c04d1d8d4d18"} Apr 21 17:37:50.294931 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:50.294931 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4f8e89a-b68a-4807-9aca-4424ccedd246","Type":"ContainerStarted","Data":"cf67b1f0c6a83c9acd7749255726e4dde055123d6067f789d5b6bf466430e475"} Apr 21 17:37:50.295423 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:50.294944 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4f8e89a-b68a-4807-9aca-4424ccedd246","Type":"ContainerStarted","Data":"e789bd77845c186846acaa5f53af6924fbb9847ba0c52beeefe9ab191c4fbc2a"} Apr 21 17:37:50.295423 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:50.294957 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4f8e89a-b68a-4807-9aca-4424ccedd246","Type":"ContainerStarted","Data":"21f66b60f4f0597033da6b12799bd42268b351783aee1ca84e596a89486b48c0"} Apr 21 17:37:50.295423 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:50.294967 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e4f8e89a-b68a-4807-9aca-4424ccedd246","Type":"ContainerStarted","Data":"d382a1bf153830755ebc3bd10ce93c5de75dbc87045e92ea17be36f970ad8c22"} Apr 21 17:37:50.295423 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:50.294977 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e4f8e89a-b68a-4807-9aca-4424ccedd246","Type":"ContainerStarted","Data":"0b5466d82dec4fc61eccaeac6f38b7ec687100df77d2646a7d330318129d0bb1"} Apr 21 17:37:50.339197 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:50.339134 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.339120989 podStartE2EDuration="2.339120989s" podCreationTimestamp="2026-04-21 17:37:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:37:50.33882094 +0000 UTC m=+257.478883729" watchObservedRunningTime="2026-04-21 17:37:50.339120989 +0000 UTC m=+257.479183756" Apr 21 17:37:51.133779 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.133745 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 17:37:51.134211 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.134163 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy" containerID="cri-o://1d5b78226227ca502c35b362e3be78e6e5072eaf7ccce6c0ab6235205ced4fb4" gracePeriod=600 Apr 21 17:37:51.134310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.134189 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="thanos-sidecar" containerID="cri-o://91c0dcfa0814c5b459e99aab82a04881787e7d0f6b8c9008303a1cca7042f20b" gracePeriod=600 Apr 21 17:37:51.134310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.134224 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="config-reloader" 
containerID="cri-o://558d3e5dbcdca1b1a28a58c85f743e6482622eff966e2c5406933257cd2d4aa0" gracePeriod=600 Apr 21 17:37:51.134310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.134235 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy-thanos" containerID="cri-o://71aea83c78475ef48ee94c94d4aa38934adfa7bf7d7aa926f18d69a09b7e4921" gracePeriod=600 Apr 21 17:37:51.134310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.134148 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="prometheus" containerID="cri-o://cd969509d181f4fbc0c959435b7ba3ab3bc4accd5ea11b5c454011f14b5132d3" gracePeriod=600 Apr 21 17:37:51.134518 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.134414 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy-web" containerID="cri-o://a30e2d322aa223d0d6d2b15ad9e74895e896dbb16363a669053ace6b3ee63dff" gracePeriod=600 Apr 21 17:37:51.301778 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.301746 2583 generic.go:358] "Generic (PLEG): container finished" podID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerID="71aea83c78475ef48ee94c94d4aa38934adfa7bf7d7aa926f18d69a09b7e4921" exitCode=0 Apr 21 17:37:51.301778 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.301771 2583 generic.go:358] "Generic (PLEG): container finished" podID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerID="1d5b78226227ca502c35b362e3be78e6e5072eaf7ccce6c0ab6235205ced4fb4" exitCode=0 Apr 21 17:37:51.301778 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.301778 2583 generic.go:358] "Generic (PLEG): container finished" podID="ba310721-5ccc-4564-b8bd-54ae583077ea" 
containerID="91c0dcfa0814c5b459e99aab82a04881787e7d0f6b8c9008303a1cca7042f20b" exitCode=0 Apr 21 17:37:51.301778 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.301784 2583 generic.go:358] "Generic (PLEG): container finished" podID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerID="558d3e5dbcdca1b1a28a58c85f743e6482622eff966e2c5406933257cd2d4aa0" exitCode=0 Apr 21 17:37:51.302261 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.301789 2583 generic.go:358] "Generic (PLEG): container finished" podID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerID="cd969509d181f4fbc0c959435b7ba3ab3bc4accd5ea11b5c454011f14b5132d3" exitCode=0 Apr 21 17:37:51.302261 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.301827 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerDied","Data":"71aea83c78475ef48ee94c94d4aa38934adfa7bf7d7aa926f18d69a09b7e4921"} Apr 21 17:37:51.302261 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.301871 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerDied","Data":"1d5b78226227ca502c35b362e3be78e6e5072eaf7ccce6c0ab6235205ced4fb4"} Apr 21 17:37:51.302261 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.301886 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerDied","Data":"91c0dcfa0814c5b459e99aab82a04881787e7d0f6b8c9008303a1cca7042f20b"} Apr 21 17:37:51.302261 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:51.301900 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerDied","Data":"558d3e5dbcdca1b1a28a58c85f743e6482622eff966e2c5406933257cd2d4aa0"} Apr 21 17:37:51.302261 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:37:51.301914 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerDied","Data":"cd969509d181f4fbc0c959435b7ba3ab3bc4accd5ea11b5c454011f14b5132d3"} Apr 21 17:37:52.307951 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.307920 2583 generic.go:358] "Generic (PLEG): container finished" podID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerID="a30e2d322aa223d0d6d2b15ad9e74895e896dbb16363a669053ace6b3ee63dff" exitCode=0 Apr 21 17:37:52.308333 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.307980 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerDied","Data":"a30e2d322aa223d0d6d2b15ad9e74895e896dbb16363a669053ace6b3ee63dff"} Apr 21 17:37:52.382745 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.382718 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:52.454882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.454802 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-tls-assets\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.454882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.454846 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-kube-rbac-proxy\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455089 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.454908 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-tls\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455089 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.454987 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-thanos-prometheus-http-client-file\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455089 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455049 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-serving-certs-ca-bundle\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: 
\"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455246 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455094 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-metrics-client-certs\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455246 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455126 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-rulefiles-0\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455246 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455203 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-metrics-client-ca\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455246 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455234 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455437 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455277 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-web-config\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: 
\"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455437 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455320 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455437 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455361 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-config-out\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455437 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455388 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-config\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455437 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455428 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n9cd\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-kube-api-access-7n9cd\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455671 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455482 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-db\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455671 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:37:52.455509 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-grpc-tls\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455671 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455573 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-kubelet-serving-ca-bundle\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455671 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455599 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-trusted-ca-bundle\") pod \"ba310721-5ccc-4564-b8bd-54ae583077ea\" (UID: \"ba310721-5ccc-4564-b8bd-54ae583077ea\") " Apr 21 17:37:52.455859 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455753 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:52.455990 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.455974 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-metrics-client-ca\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.457797 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.456302 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:52.458256 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.458025 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:37:52.458256 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.458032 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:52.459524 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.458547 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:52.459524 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.458615 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:52.459524 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.458947 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:52.459524 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.459486 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). 
InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:52.461239 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.461123 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:52.461239 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.461192 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:52.461239 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.461160 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:52.461562 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.461523 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). 
InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:52.461562 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.461541 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:37:52.461702 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.461562 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-kube-api-access-7n9cd" (OuterVolumeSpecName: "kube-api-access-7n9cd") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "kube-api-access-7n9cd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:37:52.461952 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.461918 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-config" (OuterVolumeSpecName: "config") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:52.462033 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.462001 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:52.463151 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.463129 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-config-out" (OuterVolumeSpecName: "config-out") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:37:52.472845 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.472823 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-web-config" (OuterVolumeSpecName: "web-config") pod "ba310721-5ccc-4564-b8bd-54ae583077ea" (UID: "ba310721-5ccc-4564-b8bd-54ae583077ea"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:52.556397 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556351 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556397 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556387 2583 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-web-config\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556397 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556403 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:37:52.556414 2583 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-config-out\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556427 2583 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-config\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556439 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7n9cd\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-kube-api-access-7n9cd\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556452 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-db\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556463 2583 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-grpc-tls\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556475 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556488 2583 reconciler_common.go:299] "Volume detached for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-trusted-ca-bundle\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556500 2583 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba310721-5ccc-4564-b8bd-54ae583077ea-tls-assets\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556511 2583 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-kube-rbac-proxy\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556524 2583 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-prometheus-k8s-tls\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556536 2583 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-thanos-prometheus-http-client-file\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556548 2583 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556560 2583 reconciler_common.go:299] "Volume detached for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ba310721-5ccc-4564-b8bd-54ae583077ea-secret-metrics-client-certs\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:52.556667 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:52.556576 2583 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ba310721-5ccc-4564-b8bd-54ae583077ea-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:37:53.316370 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.316327 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ba310721-5ccc-4564-b8bd-54ae583077ea","Type":"ContainerDied","Data":"c5b1e78dafc5eeb5112e66e795961eb4ae84ed970c8a0c0d6e8afc8d00b91892"} Apr 21 17:37:53.316763 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.316375 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.316763 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.316386 2583 scope.go:117] "RemoveContainer" containerID="71aea83c78475ef48ee94c94d4aa38934adfa7bf7d7aa926f18d69a09b7e4921" Apr 21 17:37:53.324681 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.324662 2583 scope.go:117] "RemoveContainer" containerID="1d5b78226227ca502c35b362e3be78e6e5072eaf7ccce6c0ab6235205ced4fb4" Apr 21 17:37:53.334525 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.334500 2583 scope.go:117] "RemoveContainer" containerID="a30e2d322aa223d0d6d2b15ad9e74895e896dbb16363a669053ace6b3ee63dff" Apr 21 17:37:53.341025 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.340996 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 17:37:53.342107 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.342086 2583 scope.go:117] "RemoveContainer" 
containerID="91c0dcfa0814c5b459e99aab82a04881787e7d0f6b8c9008303a1cca7042f20b" Apr 21 17:37:53.345415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.345369 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 17:37:53.350646 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.350627 2583 scope.go:117] "RemoveContainer" containerID="558d3e5dbcdca1b1a28a58c85f743e6482622eff966e2c5406933257cd2d4aa0" Apr 21 17:37:53.357162 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.357144 2583 scope.go:117] "RemoveContainer" containerID="cd969509d181f4fbc0c959435b7ba3ab3bc4accd5ea11b5c454011f14b5132d3" Apr 21 17:37:53.364145 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.364122 2583 scope.go:117] "RemoveContainer" containerID="ffbd01a6351c7fee0461bfa19d252553902871557907817296315582873fa808" Apr 21 17:37:53.372784 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.372760 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 17:37:53.373086 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373072 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="prometheus" Apr 21 17:37:53.373133 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373088 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="prometheus" Apr 21 17:37:53.373133 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373096 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="thanos-sidecar" Apr 21 17:37:53.373133 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373102 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="thanos-sidecar" Apr 21 17:37:53.373133 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373110 2583 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="config-reloader" Apr 21 17:37:53.373133 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373115 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="config-reloader" Apr 21 17:37:53.373133 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373122 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy" Apr 21 17:37:53.373133 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373128 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373139 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy-web" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373145 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy-web" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373156 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy-thanos" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373162 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy-thanos" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373203 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="init-config-reloader" Apr 21 17:37:53.373415 
ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373215 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="init-config-reloader" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373267 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373277 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy-thanos" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373284 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="thanos-sidecar" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373291 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="prometheus" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373298 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="kube-rbac-proxy-web" Apr 21 17:37:53.373415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.373304 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" containerName="config-reloader" Apr 21 17:37:53.378429 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.378412 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.381069 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381042 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5mthhaug7i2fa\"" Apr 21 17:37:53.381194 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381091 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 17:37:53.381271 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381218 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 17:37:53.381271 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381239 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 17:37:53.381379 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381364 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 17:37:53.381426 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381416 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 17:37:53.381493 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381432 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 17:37:53.381493 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381445 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 17:37:53.381493 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381432 2583 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 17:37:53.381493 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381473 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 17:37:53.381677 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381542 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8b9mh\"" Apr 21 17:37:53.381904 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.381887 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 17:37:53.384820 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.384800 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 17:37:53.386605 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.386371 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 17:37:53.389730 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.389710 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 17:37:53.463789 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.463707 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.463789 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.463738 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.463789 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.463758 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.463789 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.463775 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrk5s\" (UniqueName: \"kubernetes.io/projected/485192ea-97bf-4e77-a113-a90abb8a1ff2-kube-api-access-jrk5s\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464090 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.463881 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/485192ea-97bf-4e77-a113-a90abb8a1ff2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464090 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.463916 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464090 ip-10-0-129-92 kubenswrapper[2583]: 
I0421 17:37:53.463943 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464090 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464025 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464090 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464072 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/485192ea-97bf-4e77-a113-a90abb8a1ff2-config-out\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464105 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/485192ea-97bf-4e77-a113-a90abb8a1ff2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464160 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-config\") pod \"prometheus-k8s-0\" (UID: 
\"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464206 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464233 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464283 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-web-config\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464320 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464346 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464387 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.464415 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.464413 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.465615 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.465594 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba310721-5ccc-4564-b8bd-54ae583077ea" path="/var/lib/kubelet/pods/ba310721-5ccc-4564-b8bd-54ae583077ea/volumes" Apr 21 17:37:53.565282 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565250 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/485192ea-97bf-4e77-a113-a90abb8a1ff2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565431 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565299 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-config\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565431 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565322 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565540 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565450 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565540 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565502 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-web-config\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565642 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565541 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565642 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565565 2583 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565642 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565607 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565784 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565640 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565784 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565674 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565784 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565691 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/485192ea-97bf-4e77-a113-a90abb8a1ff2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565784 
ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565699 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.565784 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565760 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.566027 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565791 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrk5s\" (UniqueName: \"kubernetes.io/projected/485192ea-97bf-4e77-a113-a90abb8a1ff2-kube-api-access-jrk5s\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.566027 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565857 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/485192ea-97bf-4e77-a113-a90abb8a1ff2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.566027 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565882 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 21 17:37:53.566027 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565908 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.566027 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565949 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.566027 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.565986 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/485192ea-97bf-4e77-a113-a90abb8a1ff2-config-out\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.566512 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.566497 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 17:37:53.567338 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.567311 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.568633 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.568220 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.568633 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.568226 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.568633 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.568513 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-web-config\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.568862 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.568651 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.568862 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.568725 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.569058 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.568978 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-config\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.569505 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.569435 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/485192ea-97bf-4e77-a113-a90abb8a1ff2-config-out\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.569814 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.569788 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.570809 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.570786 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/485192ea-97bf-4e77-a113-a90abb8a1ff2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.570809 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.570801 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.571524 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.571499 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.571616 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.571553 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.572063 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.572040 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/485192ea-97bf-4e77-a113-a90abb8a1ff2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.572376 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.572356 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/485192ea-97bf-4e77-a113-a90abb8a1ff2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.579015 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.578995 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrk5s\" (UniqueName: \"kubernetes.io/projected/485192ea-97bf-4e77-a113-a90abb8a1ff2-kube-api-access-jrk5s\") pod \"prometheus-k8s-0\" (UID: \"485192ea-97bf-4e77-a113-a90abb8a1ff2\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.706426 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.706385 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:37:53.843612 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:53.843587 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 17:37:53.846152 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:37:53.846121 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod485192ea_97bf_4e77_a113_a90abb8a1ff2.slice/crio-fd78f0656f71a7cf0430abd06b947e4fc816c1ac21f59223a6b0df27ea8e72dd WatchSource:0}: Error finding container fd78f0656f71a7cf0430abd06b947e4fc816c1ac21f59223a6b0df27ea8e72dd: Status 404 returned error can't find the container with id fd78f0656f71a7cf0430abd06b947e4fc816c1ac21f59223a6b0df27ea8e72dd
Apr 21 17:37:54.322040 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:54.322003 2583 generic.go:358] "Generic (PLEG): container finished" podID="485192ea-97bf-4e77-a113-a90abb8a1ff2" containerID="cff7d4905e019fda3f4f2f34711db05f77aae4360bbebf15ec4e3463f51c74a7" exitCode=0
Apr 21 17:37:54.322530 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:54.322076 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"485192ea-97bf-4e77-a113-a90abb8a1ff2","Type":"ContainerDied","Data":"cff7d4905e019fda3f4f2f34711db05f77aae4360bbebf15ec4e3463f51c74a7"}
Apr 21 17:37:54.322530 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:54.322112 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"485192ea-97bf-4e77-a113-a90abb8a1ff2","Type":"ContainerStarted","Data":"fd78f0656f71a7cf0430abd06b947e4fc816c1ac21f59223a6b0df27ea8e72dd"}
Apr 21 17:37:55.330101 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:55.330065 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"485192ea-97bf-4e77-a113-a90abb8a1ff2","Type":"ContainerStarted","Data":"89eac72e1174f95a851e9f9f85d9dfc2480d9614f204cc86b3b49c275931c992"}
Apr 21 17:37:55.330101 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:55.330105 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"485192ea-97bf-4e77-a113-a90abb8a1ff2","Type":"ContainerStarted","Data":"cf3a04abb8e599c6c3c4e5b2ab9b0c05b442bfff6b9044076c18de3cebfeae28"}
Apr 21 17:37:55.330101 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:55.330114 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"485192ea-97bf-4e77-a113-a90abb8a1ff2","Type":"ContainerStarted","Data":"69e6b4f833912e5705d66e025ac471937010b6f8318e1670d5ec97046b9a1683"}
Apr 21 17:37:55.330101 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:55.330123 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"485192ea-97bf-4e77-a113-a90abb8a1ff2","Type":"ContainerStarted","Data":"13db0334393e1c174c5a64539ee460a32cf1e28f11f3e475dc677667e89dec24"}
Apr 21 17:37:55.330683 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:55.330137 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"485192ea-97bf-4e77-a113-a90abb8a1ff2","Type":"ContainerStarted","Data":"75964f3c6fd74fe201af488a60441e29724eb6338f2a6b211bd8642d2c43bade"}
Apr 21 17:37:55.330683 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:55.330149 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"485192ea-97bf-4e77-a113-a90abb8a1ff2","Type":"ContainerStarted","Data":"d9a0a582654d799c86df2fb403e4f0ee7d6a1dc9fce160dd6eb28ffcd3285dbc"}
Apr 21 17:37:55.387300 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:55.387250 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.387235077 podStartE2EDuration="2.387235077s" podCreationTimestamp="2026-04-21 17:37:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:37:55.384948688 +0000 UTC m=+262.525011466" watchObservedRunningTime="2026-04-21 17:37:55.387235077 +0000 UTC m=+262.527297877"
Apr 21 17:37:58.707118 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:37:58.707081 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:38:33.380463 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:38:33.380433 2583 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 17:38:53.707620 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:38:53.707584 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:38:53.723638 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:38:53.723611 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:38:54.519584 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:38:54.519557 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 17:41:19.129353 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.129315 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"]
Apr 21 17:41:19.132328 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.132309 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.135138 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.135112 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 21 17:41:19.135138 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.135128 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 21 17:41:19.136960 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.136944 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 21 17:41:19.137064 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.137044 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 21 17:41:19.137194 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.137161 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-gf9sq\""
Apr 21 17:41:19.150577 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.150547 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"]
Apr 21 17:41:19.210795 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.210755 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f61381e7-9639-4b7b-8596-4906556a0b03-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g\" (UID: \"f61381e7-9639-4b7b-8596-4906556a0b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.210998 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.210844 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpg8l\" (UniqueName: \"kubernetes.io/projected/f61381e7-9639-4b7b-8596-4906556a0b03-kube-api-access-rpg8l\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g\" (UID: \"f61381e7-9639-4b7b-8596-4906556a0b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.210998 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.210890 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f61381e7-9639-4b7b-8596-4906556a0b03-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g\" (UID: \"f61381e7-9639-4b7b-8596-4906556a0b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.311787 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.311744 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpg8l\" (UniqueName: \"kubernetes.io/projected/f61381e7-9639-4b7b-8596-4906556a0b03-kube-api-access-rpg8l\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g\" (UID: \"f61381e7-9639-4b7b-8596-4906556a0b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.311991 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.311800 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f61381e7-9639-4b7b-8596-4906556a0b03-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g\" (UID: \"f61381e7-9639-4b7b-8596-4906556a0b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.311991 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.311841 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f61381e7-9639-4b7b-8596-4906556a0b03-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g\" (UID: \"f61381e7-9639-4b7b-8596-4906556a0b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.314442 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.314417 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f61381e7-9639-4b7b-8596-4906556a0b03-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g\" (UID: \"f61381e7-9639-4b7b-8596-4906556a0b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.314573 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.314465 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f61381e7-9639-4b7b-8596-4906556a0b03-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g\" (UID: \"f61381e7-9639-4b7b-8596-4906556a0b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.330104 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.330076 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpg8l\" (UniqueName: \"kubernetes.io/projected/f61381e7-9639-4b7b-8596-4906556a0b03-kube-api-access-rpg8l\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g\" (UID: \"f61381e7-9639-4b7b-8596-4906556a0b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.443254 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.443147 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:19.587481 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.587451 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"]
Apr 21 17:41:19.590439 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:41:19.590408 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf61381e7_9639_4b7b_8596_4906556a0b03.slice/crio-7add3c5ca2f2ce69355d265ca5ab6b5e915c1ac231b59b1d49d8806fb87de1f0 WatchSource:0}: Error finding container 7add3c5ca2f2ce69355d265ca5ab6b5e915c1ac231b59b1d49d8806fb87de1f0: Status 404 returned error can't find the container with id 7add3c5ca2f2ce69355d265ca5ab6b5e915c1ac231b59b1d49d8806fb87de1f0
Apr 21 17:41:19.592128 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.592106 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 17:41:19.939530 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:19.939494 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g" event={"ID":"f61381e7-9639-4b7b-8596-4906556a0b03","Type":"ContainerStarted","Data":"7add3c5ca2f2ce69355d265ca5ab6b5e915c1ac231b59b1d49d8806fb87de1f0"}
Apr 21 17:41:22.952883 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:22.952840 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g" event={"ID":"f61381e7-9639-4b7b-8596-4906556a0b03","Type":"ContainerStarted","Data":"95310e795ed17dcdfecfd8423cad8f829be2ec794e204cc56b1bc3687512fd12"}
Apr 21 17:41:22.953381 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:22.952952 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:22.975290 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:22.975237 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g" podStartSLOduration=1.41579761 podStartE2EDuration="3.975221376s" podCreationTimestamp="2026-04-21 17:41:19 +0000 UTC" firstStartedPulling="2026-04-21 17:41:19.592321174 +0000 UTC m=+466.732383922" lastFinishedPulling="2026-04-21 17:41:22.151744941 +0000 UTC m=+469.291807688" observedRunningTime="2026-04-21 17:41:22.972620781 +0000 UTC m=+470.112683585" watchObservedRunningTime="2026-04-21 17:41:22.975221376 +0000 UTC m=+470.115284143"
Apr 21 17:41:33.961436 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:33.961405 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g"
Apr 21 17:41:38.903297 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.903259 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"]
Apr 21 17:41:38.906954 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.906938 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:38.909875 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.909854 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-x6qtw\""
Apr 21 17:41:38.911616 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.911599 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 21 17:41:38.912156 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.912119 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 21 17:41:38.912286 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.912210 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 21 17:41:38.912910 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.912892 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 21 17:41:38.917504 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.917480 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 21 17:41:38.930278 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.930247 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"]
Apr 21 17:41:38.980139 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.980105 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eec124d-d377-4906-b479-77f34f5fe45a-cert\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:38.980316 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.980151 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6eec124d-d377-4906-b479-77f34f5fe45a-metrics-cert\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:38.980316 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.980230 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6eec124d-d377-4906-b479-77f34f5fe45a-manager-config\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:38.980316 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:38.980293 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nrfj\" (UniqueName: \"kubernetes.io/projected/6eec124d-d377-4906-b479-77f34f5fe45a-kube-api-access-6nrfj\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.080876 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.080834 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nrfj\" (UniqueName: \"kubernetes.io/projected/6eec124d-d377-4906-b479-77f34f5fe45a-kube-api-access-6nrfj\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.081075 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.080903 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eec124d-d377-4906-b479-77f34f5fe45a-cert\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.081075 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.080949 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6eec124d-d377-4906-b479-77f34f5fe45a-metrics-cert\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.081075 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.080980 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6eec124d-d377-4906-b479-77f34f5fe45a-manager-config\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.081771 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.081739 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6eec124d-d377-4906-b479-77f34f5fe45a-manager-config\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.084303 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.084279 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6eec124d-d377-4906-b479-77f34f5fe45a-metrics-cert\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.084465 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.084443 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eec124d-d377-4906-b479-77f34f5fe45a-cert\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.110349 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.110310 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nrfj\" (UniqueName: \"kubernetes.io/projected/6eec124d-d377-4906-b479-77f34f5fe45a-kube-api-access-6nrfj\") pod \"lws-controller-manager-bdd4f6877-q58td\" (UID: \"6eec124d-d377-4906-b479-77f34f5fe45a\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.216002 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.215974 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:39.398606 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:39.398574 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"]
Apr 21 17:41:39.400698 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:41:39.400670 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eec124d_d377_4906_b479_77f34f5fe45a.slice/crio-c43d23426c1ceaf9fb9a607a88a2c3fe2807395df251b0c215a0bdeca7589fc9 WatchSource:0}: Error finding container c43d23426c1ceaf9fb9a607a88a2c3fe2807395df251b0c215a0bdeca7589fc9: Status 404 returned error can't find the container with id c43d23426c1ceaf9fb9a607a88a2c3fe2807395df251b0c215a0bdeca7589fc9
Apr 21 17:41:40.008462 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:40.008425 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td" event={"ID":"6eec124d-d377-4906-b479-77f34f5fe45a","Type":"ContainerStarted","Data":"c43d23426c1ceaf9fb9a607a88a2c3fe2807395df251b0c215a0bdeca7589fc9"}
Apr 21 17:41:43.019960 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:43.019920 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td" event={"ID":"6eec124d-d377-4906-b479-77f34f5fe45a","Type":"ContainerStarted","Data":"180526306ebb748872ba3d3ee9cd280db30dd174fc8a5d2f3b1747cc269f0680"}
Apr 21 17:41:43.020409 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:43.020039 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:41:43.042189 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:43.042127 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td" podStartSLOduration=2.456647867 podStartE2EDuration="5.042115138s" podCreationTimestamp="2026-04-21 17:41:38 +0000 UTC" firstStartedPulling="2026-04-21 17:41:39.402728941 +0000 UTC m=+486.542791686" lastFinishedPulling="2026-04-21 17:41:41.988196209 +0000 UTC m=+489.128258957" observedRunningTime="2026-04-21 17:41:43.040518194 +0000 UTC m=+490.180580961" watchObservedRunningTime="2026-04-21 17:41:43.042115138 +0000 UTC m=+490.182177904"
Apr 21 17:41:54.025860 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:41:54.025824 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-q58td"
Apr 21 17:42:06.005086 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.005045 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"]
Apr 21 17:42:06.013314 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.013283 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.016005 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.015973 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-g4mjv\""
Apr 21 17:42:06.016005 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.015995 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 21 17:42:06.021045 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.021010 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"]
Apr 21 17:42:06.117785 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.117729 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.117785 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.117784 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.118009 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.117811 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.118009 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.117836 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8jgq\" (UniqueName: \"kubernetes.io/projected/16cc9871-1c08-4e35-968e-3d455ccf671e-kube-api-access-s8jgq\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.118009 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.117873 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.118009 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.117931 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.118009 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.117971 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.118009 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.118009 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.118236 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.118054 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/16cc9871-1c08-4e35-968e-3d455ccf671e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.218897 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.218850 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.219083 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.218906 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.219083 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.218933 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.219083 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.218959 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8jgq\" (UniqueName: \"kubernetes.io/projected/16cc9871-1c08-4e35-968e-3d455ccf671e-kube-api-access-s8jgq\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.219083 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.219001 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.219083 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.219044 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.219371 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.219090 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.219371 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.219145 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.219371 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.219210 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/16cc9871-1c08-4e35-968e-3d455ccf671e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"
Apr 21 17:42:06.219723 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.219693 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName:
\"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.219857 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.219838 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.220195 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.220125 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.220390 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.220362 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.220490 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.220366 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/16cc9871-1c08-4e35-968e-3d455ccf671e-istiod-ca-cert\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.221844 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.221821 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.222686 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.222663 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.227999 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.227971 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8jgq\" (UniqueName: \"kubernetes.io/projected/16cc9871-1c08-4e35-968e-3d455ccf671e-kube-api-access-s8jgq\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: \"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.228100 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.228056 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16cc9871-1c08-4e35-968e-3d455ccf671e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cffwd69\" (UID: 
\"16cc9871-1c08-4e35-968e-3d455ccf671e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.326604 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.326498 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:06.461790 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:06.461760 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69"] Apr 21 17:42:06.464349 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:42:06.464303 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16cc9871_1c08_4e35_968e_3d455ccf671e.slice/crio-f89eb8f450a679342e4b95292f068bfdbeeacff026c22ed1e7b15a266ad3aedf WatchSource:0}: Error finding container f89eb8f450a679342e4b95292f068bfdbeeacff026c22ed1e7b15a266ad3aedf: Status 404 returned error can't find the container with id f89eb8f450a679342e4b95292f068bfdbeeacff026c22ed1e7b15a266ad3aedf Apr 21 17:42:07.095969 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:07.095931 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" event={"ID":"16cc9871-1c08-4e35-968e-3d455ccf671e","Type":"ContainerStarted","Data":"f89eb8f450a679342e4b95292f068bfdbeeacff026c22ed1e7b15a266ad3aedf"} Apr 21 17:42:09.126965 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:09.126926 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 17:42:09.127272 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:09.127010 2583 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 17:42:09.127272 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:09.127040 2583 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 21 17:42:10.108718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:10.108674 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" event={"ID":"16cc9871-1c08-4e35-968e-3d455ccf671e","Type":"ContainerStarted","Data":"227a0c0fa08e6d91206f324024ed9135b3828ff2475080409c84c06bb1b49e8b"} Apr 21 17:42:10.132274 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:10.132224 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" podStartSLOduration=2.471810889 podStartE2EDuration="5.132207129s" podCreationTimestamp="2026-04-21 17:42:05 +0000 UTC" firstStartedPulling="2026-04-21 17:42:06.466257198 +0000 UTC m=+513.606319943" lastFinishedPulling="2026-04-21 17:42:09.126653438 +0000 UTC m=+516.266716183" observedRunningTime="2026-04-21 17:42:10.12993467 +0000 UTC m=+517.269997449" watchObservedRunningTime="2026-04-21 17:42:10.132207129 +0000 UTC m=+517.272269895" Apr 21 17:42:10.326919 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:10.326872 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:10.331752 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:10.331726 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:11.112021 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:42:11.111949 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:11.112959 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:11.112942 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cffwd69" Apr 21 17:42:37.982324 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:37.982288 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-g4cgg"] Apr 21 17:42:37.991781 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:37.991758 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" Apr 21 17:42:37.995288 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:37.995242 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-g4cgg"] Apr 21 17:42:37.995701 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:37.995674 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-lqsm8\"" Apr 21 17:42:37.996776 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:37.996753 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 17:42:37.996873 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:37.996827 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 17:42:38.103599 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.103557 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-867xq\" (UniqueName: \"kubernetes.io/projected/414ac8c7-bd76-4c6e-ab31-7fe35c79fc70-kube-api-access-867xq\") pod 
\"kuadrant-operator-catalog-g4cgg\" (UID: \"414ac8c7-bd76-4c6e-ab31-7fe35c79fc70\") " pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" Apr 21 17:42:38.204245 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.204214 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-867xq\" (UniqueName: \"kubernetes.io/projected/414ac8c7-bd76-4c6e-ab31-7fe35c79fc70-kube-api-access-867xq\") pod \"kuadrant-operator-catalog-g4cgg\" (UID: \"414ac8c7-bd76-4c6e-ab31-7fe35c79fc70\") " pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" Apr 21 17:42:38.217779 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.217746 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-867xq\" (UniqueName: \"kubernetes.io/projected/414ac8c7-bd76-4c6e-ab31-7fe35c79fc70-kube-api-access-867xq\") pod \"kuadrant-operator-catalog-g4cgg\" (UID: \"414ac8c7-bd76-4c6e-ab31-7fe35c79fc70\") " pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" Apr 21 17:42:38.302185 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.302080 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" Apr 21 17:42:38.328635 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.328602 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-g4cgg"] Apr 21 17:42:38.429145 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.429117 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-g4cgg"] Apr 21 17:42:38.431713 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:42:38.431685 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod414ac8c7_bd76_4c6e_ab31_7fe35c79fc70.slice/crio-f227a16a710b0b07a80408f2b0e4b91d72788f038d8e29a08e0fa90284ebbb6c WatchSource:0}: Error finding container f227a16a710b0b07a80408f2b0e4b91d72788f038d8e29a08e0fa90284ebbb6c: Status 404 returned error can't find the container with id f227a16a710b0b07a80408f2b0e4b91d72788f038d8e29a08e0fa90284ebbb6c Apr 21 17:42:38.540421 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.540387 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nc9q5"] Apr 21 17:42:38.543548 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.543531 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" Apr 21 17:42:38.565761 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.565687 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nc9q5"] Apr 21 17:42:38.608206 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.608158 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v27v\" (UniqueName: \"kubernetes.io/projected/f349e837-52d4-4138-b150-4f2f90ca1ec8-kube-api-access-4v27v\") pod \"kuadrant-operator-catalog-nc9q5\" (UID: \"f349e837-52d4-4138-b150-4f2f90ca1ec8\") " pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" Apr 21 17:42:38.709024 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.708990 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4v27v\" (UniqueName: \"kubernetes.io/projected/f349e837-52d4-4138-b150-4f2f90ca1ec8-kube-api-access-4v27v\") pod \"kuadrant-operator-catalog-nc9q5\" (UID: \"f349e837-52d4-4138-b150-4f2f90ca1ec8\") " pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" Apr 21 17:42:38.717736 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.717710 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v27v\" (UniqueName: \"kubernetes.io/projected/f349e837-52d4-4138-b150-4f2f90ca1ec8-kube-api-access-4v27v\") pod \"kuadrant-operator-catalog-nc9q5\" (UID: \"f349e837-52d4-4138-b150-4f2f90ca1ec8\") " pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" Apr 21 17:42:38.853254 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.853129 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" Apr 21 17:42:38.973996 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:38.973974 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-nc9q5"] Apr 21 17:42:38.976253 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:42:38.976229 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf349e837_52d4_4138_b150_4f2f90ca1ec8.slice/crio-1b24f6c6f609b7518fc2266000ae3c251cb5dd9e13e6976f1922891dee61d983 WatchSource:0}: Error finding container 1b24f6c6f609b7518fc2266000ae3c251cb5dd9e13e6976f1922891dee61d983: Status 404 returned error can't find the container with id 1b24f6c6f609b7518fc2266000ae3c251cb5dd9e13e6976f1922891dee61d983 Apr 21 17:42:39.207718 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:39.207674 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" event={"ID":"f349e837-52d4-4138-b150-4f2f90ca1ec8","Type":"ContainerStarted","Data":"1b24f6c6f609b7518fc2266000ae3c251cb5dd9e13e6976f1922891dee61d983"} Apr 21 17:42:39.208865 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:39.208837 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" event={"ID":"414ac8c7-bd76-4c6e-ab31-7fe35c79fc70","Type":"ContainerStarted","Data":"f227a16a710b0b07a80408f2b0e4b91d72788f038d8e29a08e0fa90284ebbb6c"} Apr 21 17:42:41.217517 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:41.217481 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" event={"ID":"f349e837-52d4-4138-b150-4f2f90ca1ec8","Type":"ContainerStarted","Data":"81e2b44e64e8de4df56dd66a94152026b8a03254332f7ca2f9bd8e4eccaeebe2"} Apr 21 17:42:41.218887 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:41.218862 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" event={"ID":"414ac8c7-bd76-4c6e-ab31-7fe35c79fc70","Type":"ContainerStarted","Data":"ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc"} Apr 21 17:42:41.219014 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:41.218952 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" podUID="414ac8c7-bd76-4c6e-ab31-7fe35c79fc70" containerName="registry-server" containerID="cri-o://ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc" gracePeriod=2 Apr 21 17:42:41.234437 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:41.234397 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" podStartSLOduration=1.64437582 podStartE2EDuration="3.234382596s" podCreationTimestamp="2026-04-21 17:42:38 +0000 UTC" firstStartedPulling="2026-04-21 17:42:38.978051794 +0000 UTC m=+546.118114539" lastFinishedPulling="2026-04-21 17:42:40.568058558 +0000 UTC m=+547.708121315" observedRunningTime="2026-04-21 17:42:41.232687355 +0000 UTC m=+548.372750134" watchObservedRunningTime="2026-04-21 17:42:41.234382596 +0000 UTC m=+548.374445362" Apr 21 17:42:41.247215 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:41.247154 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" podStartSLOduration=2.108530019 podStartE2EDuration="4.247139475s" podCreationTimestamp="2026-04-21 17:42:37 +0000 UTC" firstStartedPulling="2026-04-21 17:42:38.433412959 +0000 UTC m=+545.573475704" lastFinishedPulling="2026-04-21 17:42:40.572022415 +0000 UTC m=+547.712085160" observedRunningTime="2026-04-21 17:42:41.246716752 +0000 UTC m=+548.386779521" watchObservedRunningTime="2026-04-21 17:42:41.247139475 +0000 UTC m=+548.387202272" Apr 21 17:42:41.463434 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:41.463409 2583 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" Apr 21 17:42:41.536666 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:41.536573 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-867xq\" (UniqueName: \"kubernetes.io/projected/414ac8c7-bd76-4c6e-ab31-7fe35c79fc70-kube-api-access-867xq\") pod \"414ac8c7-bd76-4c6e-ab31-7fe35c79fc70\" (UID: \"414ac8c7-bd76-4c6e-ab31-7fe35c79fc70\") " Apr 21 17:42:41.543517 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:41.543486 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414ac8c7-bd76-4c6e-ab31-7fe35c79fc70-kube-api-access-867xq" (OuterVolumeSpecName: "kube-api-access-867xq") pod "414ac8c7-bd76-4c6e-ab31-7fe35c79fc70" (UID: "414ac8c7-bd76-4c6e-ab31-7fe35c79fc70"). InnerVolumeSpecName "kube-api-access-867xq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:42:41.638205 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:41.638141 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-867xq\" (UniqueName: \"kubernetes.io/projected/414ac8c7-bd76-4c6e-ab31-7fe35c79fc70-kube-api-access-867xq\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:42:42.223482 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:42.223441 2583 generic.go:358] "Generic (PLEG): container finished" podID="414ac8c7-bd76-4c6e-ab31-7fe35c79fc70" containerID="ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc" exitCode=0 Apr 21 17:42:42.223969 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:42.223523 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" Apr 21 17:42:42.223969 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:42.223529 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" event={"ID":"414ac8c7-bd76-4c6e-ab31-7fe35c79fc70","Type":"ContainerDied","Data":"ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc"} Apr 21 17:42:42.223969 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:42.223577 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-g4cgg" event={"ID":"414ac8c7-bd76-4c6e-ab31-7fe35c79fc70","Type":"ContainerDied","Data":"f227a16a710b0b07a80408f2b0e4b91d72788f038d8e29a08e0fa90284ebbb6c"} Apr 21 17:42:42.223969 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:42.223600 2583 scope.go:117] "RemoveContainer" containerID="ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc" Apr 21 17:42:42.232310 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:42.232295 2583 scope.go:117] "RemoveContainer" containerID="ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc" Apr 21 17:42:42.232581 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:42:42.232563 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc\": container with ID starting with ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc not found: ID does not exist" containerID="ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc" Apr 21 17:42:42.232629 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:42.232590 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc"} err="failed to get container status \"ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc\": rpc error: 
code = NotFound desc = could not find container \"ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc\": container with ID starting with ff3f2c628f943ff0f4883272ebc5c6a1ace6dae9acb7121d2e7355ce850de1cc not found: ID does not exist" Apr 21 17:42:42.244065 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:42.244039 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-g4cgg"] Apr 21 17:42:42.245731 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:42.245713 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-g4cgg"] Apr 21 17:42:43.465056 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:43.465026 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414ac8c7-bd76-4c6e-ab31-7fe35c79fc70" path="/var/lib/kubelet/pods/414ac8c7-bd76-4c6e-ab31-7fe35c79fc70/volumes" Apr 21 17:42:48.853339 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:48.853297 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" Apr 21 17:42:48.853339 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:48.853341 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" Apr 21 17:42:48.875465 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:48.875440 2583 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" Apr 21 17:42:49.270937 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:42:49.270909 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-nc9q5" Apr 21 17:43:09.455794 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.455758 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8"] Apr 21 17:43:09.456231 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:43:09.456111 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="414ac8c7-bd76-4c6e-ab31-7fe35c79fc70" containerName="registry-server" Apr 21 17:43:09.456231 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.456124 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="414ac8c7-bd76-4c6e-ab31-7fe35c79fc70" containerName="registry-server" Apr 21 17:43:09.456231 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.456218 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="414ac8c7-bd76-4c6e-ab31-7fe35c79fc70" containerName="registry-server" Apr 21 17:43:09.458056 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.458039 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8" Apr 21 17:43:09.460646 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.460620 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-hsv8w\"" Apr 21 17:43:09.460646 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.460632 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 21 17:43:09.469579 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.469540 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8"] Apr 21 17:43:09.488471 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.488436 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt88l\" (UniqueName: \"kubernetes.io/projected/fe34329f-98ea-4289-bbb4-f6052a0a01f5-kube-api-access-bt88l\") pod \"dns-operator-controller-manager-648d5c98bc-h9sv8\" (UID: \"fe34329f-98ea-4289-bbb4-f6052a0a01f5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8" 
Apr 21 17:43:09.589621 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.589590 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bt88l\" (UniqueName: \"kubernetes.io/projected/fe34329f-98ea-4289-bbb4-f6052a0a01f5-kube-api-access-bt88l\") pod \"dns-operator-controller-manager-648d5c98bc-h9sv8\" (UID: \"fe34329f-98ea-4289-bbb4-f6052a0a01f5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8"
Apr 21 17:43:09.598150 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.598112 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt88l\" (UniqueName: \"kubernetes.io/projected/fe34329f-98ea-4289-bbb4-f6052a0a01f5-kube-api-access-bt88l\") pod \"dns-operator-controller-manager-648d5c98bc-h9sv8\" (UID: \"fe34329f-98ea-4289-bbb4-f6052a0a01f5\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8"
Apr 21 17:43:09.770372 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.770278 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8"
Apr 21 17:43:09.903197 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:09.903149 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8"]
Apr 21 17:43:09.905282 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:43:09.905254 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe34329f_98ea_4289_bbb4_f6052a0a01f5.slice/crio-8b140fc587e83cf1dc74ddbd8919309d6681e1c158dd80fd2f4d10155cb055a0 WatchSource:0}: Error finding container 8b140fc587e83cf1dc74ddbd8919309d6681e1c158dd80fd2f4d10155cb055a0: Status 404 returned error can't find the container with id 8b140fc587e83cf1dc74ddbd8919309d6681e1c158dd80fd2f4d10155cb055a0
Apr 21 17:43:10.317228 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:10.317192 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8" event={"ID":"fe34329f-98ea-4289-bbb4-f6052a0a01f5","Type":"ContainerStarted","Data":"8b140fc587e83cf1dc74ddbd8919309d6681e1c158dd80fd2f4d10155cb055a0"}
Apr 21 17:43:12.329166 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:12.329134 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8" event={"ID":"fe34329f-98ea-4289-bbb4-f6052a0a01f5","Type":"ContainerStarted","Data":"e8815edac675f91865b63af9c4e6c734e4f61da7e33f1836a395d91ed11dfc37"}
Apr 21 17:43:12.329571 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:12.329205 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8"
Apr 21 17:43:12.347364 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:12.347310 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8" podStartSLOduration=1.103118789 podStartE2EDuration="3.347295487s" podCreationTimestamp="2026-04-21 17:43:09 +0000 UTC" firstStartedPulling="2026-04-21 17:43:09.907131804 +0000 UTC m=+577.047194552" lastFinishedPulling="2026-04-21 17:43:12.151308497 +0000 UTC m=+579.291371250" observedRunningTime="2026-04-21 17:43:12.345408314 +0000 UTC m=+579.485471082" watchObservedRunningTime="2026-04-21 17:43:12.347295487 +0000 UTC m=+579.487358255"
Apr 21 17:43:12.915599 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:12.915562 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"]
Apr 21 17:43:12.918045 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:12.918030 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"
Apr 21 17:43:12.920698 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:12.920678 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-vql4l\""
Apr 21 17:43:12.933577 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:12.933549 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"]
Apr 21 17:43:13.021832 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:13.021792 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lbbw\" (UniqueName: \"kubernetes.io/projected/da11be1b-6725-4e2a-82de-defe2a77dbdc-kube-api-access-7lbbw\") pod \"limitador-operator-controller-manager-85c4996f8c-2hwrl\" (UID: \"da11be1b-6725-4e2a-82de-defe2a77dbdc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"
Apr 21 17:43:13.123076 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:13.123033 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lbbw\" (UniqueName: \"kubernetes.io/projected/da11be1b-6725-4e2a-82de-defe2a77dbdc-kube-api-access-7lbbw\") pod \"limitador-operator-controller-manager-85c4996f8c-2hwrl\" (UID: \"da11be1b-6725-4e2a-82de-defe2a77dbdc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"
Apr 21 17:43:13.131041 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:13.131017 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lbbw\" (UniqueName: \"kubernetes.io/projected/da11be1b-6725-4e2a-82de-defe2a77dbdc-kube-api-access-7lbbw\") pod \"limitador-operator-controller-manager-85c4996f8c-2hwrl\" (UID: \"da11be1b-6725-4e2a-82de-defe2a77dbdc\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"
Apr 21 17:43:13.228328 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:13.228278 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"
Apr 21 17:43:13.354458 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:13.354426 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"]
Apr 21 17:43:13.357423 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:43:13.357381 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda11be1b_6725_4e2a_82de_defe2a77dbdc.slice/crio-a71396934944d634560674448eeaaf7dd494886d4bd50a13f90870f84e46a996 WatchSource:0}: Error finding container a71396934944d634560674448eeaaf7dd494886d4bd50a13f90870f84e46a996: Status 404 returned error can't find the container with id a71396934944d634560674448eeaaf7dd494886d4bd50a13f90870f84e46a996
Apr 21 17:43:14.337199 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:14.337141 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl" event={"ID":"da11be1b-6725-4e2a-82de-defe2a77dbdc","Type":"ContainerStarted","Data":"a71396934944d634560674448eeaaf7dd494886d4bd50a13f90870f84e46a996"}
Apr 21 17:43:15.342388 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:15.342350 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl" event={"ID":"da11be1b-6725-4e2a-82de-defe2a77dbdc","Type":"ContainerStarted","Data":"eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d"}
Apr 21 17:43:15.342865 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:15.342406 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"
Apr 21 17:43:15.369580 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:15.369532 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl" podStartSLOduration=1.558270352 podStartE2EDuration="3.369518422s" podCreationTimestamp="2026-04-21 17:43:12 +0000 UTC" firstStartedPulling="2026-04-21 17:43:13.359533681 +0000 UTC m=+580.499596426" lastFinishedPulling="2026-04-21 17:43:15.17078174 +0000 UTC m=+582.310844496" observedRunningTime="2026-04-21 17:43:15.366694186 +0000 UTC m=+582.506756953" watchObservedRunningTime="2026-04-21 17:43:15.369518422 +0000 UTC m=+582.509581244"
Apr 21 17:43:21.472473 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:21.472439 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-z4tzz"]
Apr 21 17:43:21.475063 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:21.475045 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-z4tzz"
Apr 21 17:43:21.478833 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:21.478809 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-gv6fj\""
Apr 21 17:43:21.487674 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:21.487652 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-z4tzz"]
Apr 21 17:43:21.600186 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:21.600102 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcr4b\" (UniqueName: \"kubernetes.io/projected/e726105f-d76d-4f78-8162-228534307ab2-kube-api-access-rcr4b\") pod \"authorino-operator-657f44b778-z4tzz\" (UID: \"e726105f-d76d-4f78-8162-228534307ab2\") " pod="kuadrant-system/authorino-operator-657f44b778-z4tzz"
Apr 21 17:43:21.700825 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:21.700791 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcr4b\" (UniqueName: \"kubernetes.io/projected/e726105f-d76d-4f78-8162-228534307ab2-kube-api-access-rcr4b\") pod \"authorino-operator-657f44b778-z4tzz\" (UID: \"e726105f-d76d-4f78-8162-228534307ab2\") " pod="kuadrant-system/authorino-operator-657f44b778-z4tzz"
Apr 21 17:43:21.709769 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:21.709739 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcr4b\" (UniqueName: \"kubernetes.io/projected/e726105f-d76d-4f78-8162-228534307ab2-kube-api-access-rcr4b\") pod \"authorino-operator-657f44b778-z4tzz\" (UID: \"e726105f-d76d-4f78-8162-228534307ab2\") " pod="kuadrant-system/authorino-operator-657f44b778-z4tzz"
Apr 21 17:43:21.784836 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:21.784745 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-z4tzz"
Apr 21 17:43:21.918103 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:21.918076 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-z4tzz"]
Apr 21 17:43:21.920610 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:43:21.920581 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode726105f_d76d_4f78_8162_228534307ab2.slice/crio-4eff08529f2a5be261842487b4a2eccf6dcae6525c5260af3105a263818bf9bd WatchSource:0}: Error finding container 4eff08529f2a5be261842487b4a2eccf6dcae6525c5260af3105a263818bf9bd: Status 404 returned error can't find the container with id 4eff08529f2a5be261842487b4a2eccf6dcae6525c5260af3105a263818bf9bd
Apr 21 17:43:22.367109 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:22.367070 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-z4tzz" event={"ID":"e726105f-d76d-4f78-8162-228534307ab2","Type":"ContainerStarted","Data":"4eff08529f2a5be261842487b4a2eccf6dcae6525c5260af3105a263818bf9bd"}
Apr 21 17:43:23.335635 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:23.335591 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-h9sv8"
Apr 21 17:43:25.378693 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:25.378645 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-z4tzz" event={"ID":"e726105f-d76d-4f78-8162-228534307ab2","Type":"ContainerStarted","Data":"bee5353b92536d7e641fc2dfb6244cdc4cd29ccbbaa00d4f8432afb4f72e1990"}
Apr 21 17:43:25.379147 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:25.378733 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-z4tzz"
Apr 21 17:43:25.396050 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:25.396004 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-z4tzz" podStartSLOduration=1.988783526 podStartE2EDuration="4.395989506s" podCreationTimestamp="2026-04-21 17:43:21 +0000 UTC" firstStartedPulling="2026-04-21 17:43:21.922634921 +0000 UTC m=+589.062697671" lastFinishedPulling="2026-04-21 17:43:24.329840905 +0000 UTC m=+591.469903651" observedRunningTime="2026-04-21 17:43:25.394248047 +0000 UTC m=+592.534310892" watchObservedRunningTime="2026-04-21 17:43:25.395989506 +0000 UTC m=+592.536052360"
Apr 21 17:43:26.348484 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:26.348453 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"
Apr 21 17:43:36.384319 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:36.384288 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-z4tzz"
Apr 21 17:43:37.408435 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.408389 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp"]
Apr 21 17:43:37.413348 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.413325 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp"
Apr 21 17:43:37.415959 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.415941 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-jn8hf\""
Apr 21 17:43:37.419472 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.419448 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp"]
Apr 21 17:43:37.423709 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:43:37.423678 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-xkn8c], unattached volumes=[], failed to process volumes=[extensions-socket-volume kube-api-access-xkn8c]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp" podUID="c61199fa-72d7-464f-b671-8334386ae7ad"
Apr 21 17:43:37.425077 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.425058 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp"]
Apr 21 17:43:37.429211 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.429192 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp"
Apr 21 17:43:37.433111 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.433085 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp"
Apr 21 17:43:37.435922 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.435883 2583 status_manager.go:919] "Failed to update status for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61199fa-72d7-464f-b671-8334386ae7ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-21T17:43:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-21T17:43:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-21T17:43:37Z\\\",\\\"message\\\":\\\"containers with unready status: [manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-04-21T17:43:37Z\\\",\\\"message\\\":\\\"containers with unready status: [manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/kuadrant/kuadrant-operator:v1.4.2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/kuadrant\\\",\\\"name\\\":\\\"extensions-socket-volume\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xkn8c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"10.0.129.92\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"10.0.129.92\\\"}],\\\"startTime\\\":\\\"2026-04-21T17:43:37Z\\\"}}\" for pod \"kuadrant-system\"/\"kuadrant-operator-controller-manager-55c7f4c975-kzshp\": pods \"kuadrant-operator-controller-manager-55c7f4c975-kzshp\" not found"
Apr 21 17:43:37.438668 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.438644 2583 status_manager.go:895] "Failed to get status for pod" podUID="c61199fa-72d7-464f-b671-8334386ae7ad" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-kzshp\" is forbidden: User \"system:node:ip-10-0-129-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-92.ec2.internal' and this object"
Apr 21 17:43:37.464504 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.464475 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61199fa-72d7-464f-b671-8334386ae7ad" path="/var/lib/kubelet/pods/c61199fa-72d7-464f-b671-8334386ae7ad/volumes"
Apr 21 17:43:37.473969 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.473941 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"]
Apr 21 17:43:37.477821 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.477798 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:43:37.493902 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.493877 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"]
Apr 21 17:43:37.515190 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.515140 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"]
Apr 21 17:43:37.516252 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.516216 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" containerName="manager" containerID="cri-o://eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d" gracePeriod=2
Apr 21 17:43:37.529082 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.529054 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"]
Apr 21 17:43:37.650659 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.650625 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/233b2076-a625-45a7-b796-74a94445bca7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-42rpt\" (UID: \"233b2076-a625-45a7-b796-74a94445bca7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:43:37.650810 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.650771 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfv7m\" (UniqueName: \"kubernetes.io/projected/233b2076-a625-45a7-b796-74a94445bca7-kube-api-access-hfv7m\") pod \"kuadrant-operator-controller-manager-55c7f4c975-42rpt\" (UID: \"233b2076-a625-45a7-b796-74a94445bca7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:43:37.751629 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.751598 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfv7m\" (UniqueName: \"kubernetes.io/projected/233b2076-a625-45a7-b796-74a94445bca7-kube-api-access-hfv7m\") pod \"kuadrant-operator-controller-manager-55c7f4c975-42rpt\" (UID: \"233b2076-a625-45a7-b796-74a94445bca7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:43:37.751769 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.751659 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/233b2076-a625-45a7-b796-74a94445bca7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-42rpt\" (UID: \"233b2076-a625-45a7-b796-74a94445bca7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:43:37.751977 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.751960 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/233b2076-a625-45a7-b796-74a94445bca7-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-42rpt\" (UID: \"233b2076-a625-45a7-b796-74a94445bca7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:43:37.757549 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.757526 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"
Apr 21 17:43:37.759865 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.759840 2583 status_manager.go:895] "Failed to get status for pod" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl" err="pods \"limitador-operator-controller-manager-85c4996f8c-2hwrl\" is forbidden: User \"system:node:ip-10-0-129-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-92.ec2.internal' and this object"
Apr 21 17:43:37.762888 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.762868 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfv7m\" (UniqueName: \"kubernetes.io/projected/233b2076-a625-45a7-b796-74a94445bca7-kube-api-access-hfv7m\") pod \"kuadrant-operator-controller-manager-55c7f4c975-42rpt\" (UID: \"233b2076-a625-45a7-b796-74a94445bca7\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:43:37.787883 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.787850 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:43:37.852204 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.852150 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lbbw\" (UniqueName: \"kubernetes.io/projected/da11be1b-6725-4e2a-82de-defe2a77dbdc-kube-api-access-7lbbw\") pod \"da11be1b-6725-4e2a-82de-defe2a77dbdc\" (UID: \"da11be1b-6725-4e2a-82de-defe2a77dbdc\") "
Apr 21 17:43:37.854569 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.854536 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da11be1b-6725-4e2a-82de-defe2a77dbdc-kube-api-access-7lbbw" (OuterVolumeSpecName: "kube-api-access-7lbbw") pod "da11be1b-6725-4e2a-82de-defe2a77dbdc" (UID: "da11be1b-6725-4e2a-82de-defe2a77dbdc"). InnerVolumeSpecName "kube-api-access-7lbbw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 17:43:37.924353 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.924316 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"]
Apr 21 17:43:37.926621 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:43:37.926592 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod233b2076_a625_45a7_b796_74a94445bca7.slice/crio-1ceb5c91a5842accbb386a2d2f0947b4073376a2189b16997300e846bc7e07cd WatchSource:0}: Error finding container 1ceb5c91a5842accbb386a2d2f0947b4073376a2189b16997300e846bc7e07cd: Status 404 returned error can't find the container with id 1ceb5c91a5842accbb386a2d2f0947b4073376a2189b16997300e846bc7e07cd
Apr 21 17:43:37.953241 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:37.953160 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7lbbw\" (UniqueName: \"kubernetes.io/projected/da11be1b-6725-4e2a-82de-defe2a77dbdc-kube-api-access-7lbbw\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\""
Apr 21 17:43:38.434056 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.434011 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt" event={"ID":"233b2076-a625-45a7-b796-74a94445bca7","Type":"ContainerStarted","Data":"1ceb5c91a5842accbb386a2d2f0947b4073376a2189b16997300e846bc7e07cd"}
Apr 21 17:43:38.435342 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.435315 2583 generic.go:358] "Generic (PLEG): container finished" podID="da11be1b-6725-4e2a-82de-defe2a77dbdc" containerID="eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d" exitCode=0
Apr 21 17:43:38.435488 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.435379 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl"
Apr 21 17:43:38.435488 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.435393 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp"
Apr 21 17:43:38.435488 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.435424 2583 scope.go:117] "RemoveContainer" containerID="eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d"
Apr 21 17:43:38.438281 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.437868 2583 status_manager.go:895] "Failed to get status for pod" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl" err="pods \"limitador-operator-controller-manager-85c4996f8c-2hwrl\" is forbidden: User \"system:node:ip-10-0-129-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-92.ec2.internal' and this object"
Apr 21 17:43:38.439957 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.439925 2583 status_manager.go:895] "Failed to get status for pod" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl" err="pods \"limitador-operator-controller-manager-85c4996f8c-2hwrl\" is forbidden: User \"system:node:ip-10-0-129-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-92.ec2.internal' and this object"
Apr 21 17:43:38.441837 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.441806 2583 status_manager.go:895] "Failed to get status for pod" podUID="c61199fa-72d7-464f-b671-8334386ae7ad" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-kzshp\" is forbidden: User \"system:node:ip-10-0-129-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-92.ec2.internal' and this object"
Apr 21 17:43:38.443830 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.443801 2583 status_manager.go:895] "Failed to get status for pod" podUID="c61199fa-72d7-464f-b671-8334386ae7ad" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-kzshp\" is forbidden: User \"system:node:ip-10-0-129-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-92.ec2.internal' and this object"
Apr 21 17:43:38.445613 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.445558 2583 scope.go:117] "RemoveContainer" containerID="eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d"
Apr 21 17:43:38.445752 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.445727 2583 status_manager.go:895] "Failed to get status for pod" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl" err="pods \"limitador-operator-controller-manager-85c4996f8c-2hwrl\" is forbidden: User \"system:node:ip-10-0-129-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-92.ec2.internal' and this object"
Apr 21 17:43:38.445907 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:43:38.445886 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d\": container with ID starting with eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d not found: ID does not exist" containerID="eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d"
Apr 21 17:43:38.445981 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.445917 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d"} err="failed to get container status \"eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d\": rpc error: code = NotFound desc = could not find container \"eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d\": container with ID starting with eb037029e04f26f054cba169b19b7961774e852862d35c950b23b3ed9937764d not found: ID does not exist"
Apr 21 17:43:38.447935 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.447915 2583 status_manager.go:895] "Failed to get status for pod" podUID="c61199fa-72d7-464f-b671-8334386ae7ad" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-kzshp" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-kzshp\" is forbidden: User \"system:node:ip-10-0-129-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-92.ec2.internal' and this object"
Apr 21 17:43:38.449803 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:38.449781 2583 status_manager.go:895] "Failed to get status for pod" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-2hwrl" err="pods \"limitador-operator-controller-manager-85c4996f8c-2hwrl\" is forbidden: User \"system:node:ip-10-0-129-92.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-92.ec2.internal' and this object"
Apr 21 17:43:39.465619 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:39.465581 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" path="/var/lib/kubelet/pods/da11be1b-6725-4e2a-82de-defe2a77dbdc/volumes"
Apr 21 17:43:42.453280 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:42.453243 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt" event={"ID":"233b2076-a625-45a7-b796-74a94445bca7","Type":"ContainerStarted","Data":"8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a"}
Apr 21 17:43:42.453678 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:42.453404 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:43:42.478575 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:42.478525 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt" podStartSLOduration=1.7090369760000002 podStartE2EDuration="5.478508275s" podCreationTimestamp="2026-04-21 17:43:37 +0000 UTC" firstStartedPulling="2026-04-21 17:43:37.928822947 +0000 UTC m=+605.068885693" lastFinishedPulling="2026-04-21 17:43:41.698294238 +0000 UTC m=+608.838356992" observedRunningTime="2026-04-21 17:43:42.475448188 +0000 UTC m=+609.615510979" watchObservedRunningTime="2026-04-21 17:43:42.478508275 +0000 UTC m=+609.618571042"
Apr 21 17:43:53.459227 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:43:53.459187 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"
Apr 21 17:44:14.365360 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.365317 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"]
Apr 21 17:44:14.365820 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.365699 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" containerName="manager"
Apr 21 17:44:14.365820 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.365711 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" containerName="manager"
Apr 21 17:44:14.365820 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.365786 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="da11be1b-6725-4e2a-82de-defe2a77dbdc" containerName="manager"
Apr 21 17:44:14.372906 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.372884 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp"
Apr 21 17:44:14.375592 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.375544 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 21 17:44:14.375756 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.375726 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-hdtws\""
Apr 21 17:44:14.380406 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.380381 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"]
Apr 21 17:44:14.410045 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.410015 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"]
Apr 21 17:44:14.476312 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.476278 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwm7q\" (UniqueName: \"kubernetes.io/projected/a4d82be3-2931-4deb-934c-487fe167e261-kube-api-access-kwm7q\") pod \"limitador-limitador-78c99df468-5h6gp\" (UID: \"a4d82be3-2931-4deb-934c-487fe167e261\") " pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp"
Apr 21 17:44:14.476480 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.476337 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a4d82be3-2931-4deb-934c-487fe167e261-config-file\") pod 
\"limitador-limitador-78c99df468-5h6gp\" (UID: \"a4d82be3-2931-4deb-934c-487fe167e261\") " pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" Apr 21 17:44:14.577451 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.577396 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a4d82be3-2931-4deb-934c-487fe167e261-config-file\") pod \"limitador-limitador-78c99df468-5h6gp\" (UID: \"a4d82be3-2931-4deb-934c-487fe167e261\") " pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" Apr 21 17:44:14.577627 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.577607 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwm7q\" (UniqueName: \"kubernetes.io/projected/a4d82be3-2931-4deb-934c-487fe167e261-kube-api-access-kwm7q\") pod \"limitador-limitador-78c99df468-5h6gp\" (UID: \"a4d82be3-2931-4deb-934c-487fe167e261\") " pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" Apr 21 17:44:14.578104 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.578082 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a4d82be3-2931-4deb-934c-487fe167e261-config-file\") pod \"limitador-limitador-78c99df468-5h6gp\" (UID: \"a4d82be3-2931-4deb-934c-487fe167e261\") " pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" Apr 21 17:44:14.587220 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.587149 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwm7q\" (UniqueName: \"kubernetes.io/projected/a4d82be3-2931-4deb-934c-487fe167e261-kube-api-access-kwm7q\") pod \"limitador-limitador-78c99df468-5h6gp\" (UID: \"a4d82be3-2931-4deb-934c-487fe167e261\") " pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" Apr 21 17:44:14.686295 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.686205 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" Apr 21 17:44:14.819035 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:14.819005 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:44:14.822039 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:44:14.822001 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d82be3_2931_4deb_934c_487fe167e261.slice/crio-5122fd3f7f8b52f48b2fff866f85cdeedb3119ec9fcf6d7351b3440ddce030c8 WatchSource:0}: Error finding container 5122fd3f7f8b52f48b2fff866f85cdeedb3119ec9fcf6d7351b3440ddce030c8: Status 404 returned error can't find the container with id 5122fd3f7f8b52f48b2fff866f85cdeedb3119ec9fcf6d7351b3440ddce030c8 Apr 21 17:44:15.581732 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:15.581456 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" event={"ID":"a4d82be3-2931-4deb-934c-487fe167e261","Type":"ContainerStarted","Data":"5122fd3f7f8b52f48b2fff866f85cdeedb3119ec9fcf6d7351b3440ddce030c8"} Apr 21 17:44:17.589487 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:17.589446 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" event={"ID":"a4d82be3-2931-4deb-934c-487fe167e261","Type":"ContainerStarted","Data":"58d549df5288875adba9a70e2856757b53d68636401036e52699ade4b6151a81"} Apr 21 17:44:17.589928 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:17.589512 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" Apr 21 17:44:17.606338 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:17.606285 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" podStartSLOduration=1.048994117 
podStartE2EDuration="3.606271737s" podCreationTimestamp="2026-04-21 17:44:14 +0000 UTC" firstStartedPulling="2026-04-21 17:44:14.823598072 +0000 UTC m=+641.963660820" lastFinishedPulling="2026-04-21 17:44:17.380875693 +0000 UTC m=+644.520938440" observedRunningTime="2026-04-21 17:44:17.604894844 +0000 UTC m=+644.744957612" watchObservedRunningTime="2026-04-21 17:44:17.606271737 +0000 UTC m=+644.746334527" Apr 21 17:44:28.594143 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:28.594115 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-5h6gp" Apr 21 17:44:51.719428 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:44:51.719389 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:45:23.437661 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:23.437580 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:45:26.572288 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.572246 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s"] Apr 21 17:45:26.575978 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.575959 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.578541 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.578520 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-nnhrd\"" Apr 21 17:45:26.578680 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.578524 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 17:45:26.579587 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.579566 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 17:45:26.579685 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.579619 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 21 17:45:26.584144 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.583844 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s"] Apr 21 17:45:26.722262 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.722221 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.722447 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.722264 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.722447 ip-10-0-129-92 
kubenswrapper[2583]: I0421 17:45:26.722293 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.722447 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.722344 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.722447 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.722370 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.722447 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.722395 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfz9\" (UniqueName: \"kubernetes.io/projected/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-kube-api-access-7qfz9\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.823050 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.822970 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.823050 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.823012 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.823050 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.823030 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.823327 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.823070 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.823327 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.823089 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.823327 
ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.823109 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfz9\" (UniqueName: \"kubernetes.io/projected/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-kube-api-access-7qfz9\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.823487 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.823458 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.823660 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.823636 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.823740 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.823506 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.825662 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.825643 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-dshm\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.825947 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.825925 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.832106 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.832081 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfz9\" (UniqueName: \"kubernetes.io/projected/c8bcaf2f-5723-46b2-b6e7-42705b9b4d01-kube-api-access-7qfz9\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s\" (UID: \"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:26.888261 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:26.888224 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:27.017999 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:27.017976 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s"] Apr 21 17:45:27.020455 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:45:27.020419 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8bcaf2f_5723_46b2_b6e7_42705b9b4d01.slice/crio-5dccf3bc2dff8bbbfac81a13ed3f5a6a0ef63827c9dc82ca3484ec35d3ebee17 WatchSource:0}: Error finding container 5dccf3bc2dff8bbbfac81a13ed3f5a6a0ef63827c9dc82ca3484ec35d3ebee17: Status 404 returned error can't find the container with id 5dccf3bc2dff8bbbfac81a13ed3f5a6a0ef63827c9dc82ca3484ec35d3ebee17 Apr 21 17:45:27.542210 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:27.542153 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:45:27.829810 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:27.829718 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" event={"ID":"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01","Type":"ContainerStarted","Data":"5dccf3bc2dff8bbbfac81a13ed3f5a6a0ef63827c9dc82ca3484ec35d3ebee17"} Apr 21 17:45:32.853198 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:32.853143 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" event={"ID":"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01","Type":"ContainerStarted","Data":"aca3193ff06e2a4d33c059935f6e47df6ae78fdcf31e7a59c56103ab7c530b55"} Apr 21 17:45:35.819589 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:35.819543 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:45:37.876083 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:37.876001 
2583 generic.go:358] "Generic (PLEG): container finished" podID="c8bcaf2f-5723-46b2-b6e7-42705b9b4d01" containerID="aca3193ff06e2a4d33c059935f6e47df6ae78fdcf31e7a59c56103ab7c530b55" exitCode=0 Apr 21 17:45:37.876083 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:37.876046 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" event={"ID":"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01","Type":"ContainerDied","Data":"aca3193ff06e2a4d33c059935f6e47df6ae78fdcf31e7a59c56103ab7c530b55"} Apr 21 17:45:39.885841 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:39.885803 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" event={"ID":"c8bcaf2f-5723-46b2-b6e7-42705b9b4d01","Type":"ContainerStarted","Data":"d1a7b974c6958295bc49515bfc74fc1fe71ed7e59ccb91d8f257a2f3f7b6832c"} Apr 21 17:45:39.886802 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:39.886773 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:39.906108 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:39.906054 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" podStartSLOduration=1.902482054 podStartE2EDuration="13.906036407s" podCreationTimestamp="2026-04-21 17:45:26 +0000 UTC" firstStartedPulling="2026-04-21 17:45:27.022221 +0000 UTC m=+714.162283745" lastFinishedPulling="2026-04-21 17:45:39.025775352 +0000 UTC m=+726.165838098" observedRunningTime="2026-04-21 17:45:39.904268919 +0000 UTC m=+727.044331687" watchObservedRunningTime="2026-04-21 17:45:39.906036407 +0000 UTC m=+727.046099176" Apr 21 17:45:46.769406 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:46.769372 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb"] Apr 21 17:45:46.772849 ip-10-0-129-92 kubenswrapper[2583]: 
I0421 17:45:46.772827 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:46.775336 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:46.775310 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 21 17:45:46.783619 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:46.783580 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb"] Apr 21 17:45:46.912076 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:46.912039 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxvh2\" (UniqueName: \"kubernetes.io/projected/9ecb44b0-0eb4-473c-9289-ffa676e85f54-kube-api-access-zxvh2\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:46.912316 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:46.912102 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:46.912316 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:46.912131 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:46.912316 
ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:46.912152 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:46.912316 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:46.912210 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:46.912316 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:46.912267 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ecb44b0-0eb4-473c-9289-ffa676e85f54-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.013226 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.013150 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvh2\" (UniqueName: \"kubernetes.io/projected/9ecb44b0-0eb4-473c-9289-ffa676e85f54-kube-api-access-zxvh2\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.013407 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.013253 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.013407 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.013275 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.013407 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.013291 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.013605 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.013404 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.013605 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.013496 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ecb44b0-0eb4-473c-9289-ffa676e85f54-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.013726 
ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.013701 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.013823 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.013752 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.013823 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.013815 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.015801 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.015774 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ecb44b0-0eb4-473c-9289-ffa676e85f54-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.016021 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.016003 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ecb44b0-0eb4-473c-9289-ffa676e85f54-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" 
(UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.022397 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.022344 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxvh2\" (UniqueName: \"kubernetes.io/projected/9ecb44b0-0eb4-473c-9289-ffa676e85f54-kube-api-access-zxvh2\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb\" (UID: \"9ecb44b0-0eb4-473c-9289-ffa676e85f54\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.086401 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.086360 2583 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:47.225923 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.225895 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb"] Apr 21 17:45:47.228776 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:45:47.228740 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ecb44b0_0eb4_473c_9289_ffa676e85f54.slice/crio-c96d403378455c259a51bad00dd0082c3cb57ba03531e95ea20d9ad987f0bf30 WatchSource:0}: Error finding container c96d403378455c259a51bad00dd0082c3cb57ba03531e95ea20d9ad987f0bf30: Status 404 returned error can't find the container with id c96d403378455c259a51bad00dd0082c3cb57ba03531e95ea20d9ad987f0bf30 Apr 21 17:45:47.914127 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.914088 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" event={"ID":"9ecb44b0-0eb4-473c-9289-ffa676e85f54","Type":"ContainerStarted","Data":"21540601c95e26dea7a5bbe2d88442b6f01fad933200a63259151442eebeba39"} Apr 21 17:45:47.914127 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:47.914128 2583 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" event={"ID":"9ecb44b0-0eb4-473c-9289-ffa676e85f54","Type":"ContainerStarted","Data":"c96d403378455c259a51bad00dd0082c3cb57ba03531e95ea20d9ad987f0bf30"} Apr 21 17:45:48.022539 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:48.022504 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:45:50.903135 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:50.903102 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s" Apr 21 17:45:52.933370 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:52.933339 2583 generic.go:358] "Generic (PLEG): container finished" podID="9ecb44b0-0eb4-473c-9289-ffa676e85f54" containerID="21540601c95e26dea7a5bbe2d88442b6f01fad933200a63259151442eebeba39" exitCode=0 Apr 21 17:45:52.933804 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:52.933401 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" event={"ID":"9ecb44b0-0eb4-473c-9289-ffa676e85f54","Type":"ContainerDied","Data":"21540601c95e26dea7a5bbe2d88442b6f01fad933200a63259151442eebeba39"} Apr 21 17:45:53.938127 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:53.938090 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" event={"ID":"9ecb44b0-0eb4-473c-9289-ffa676e85f54","Type":"ContainerStarted","Data":"0a0754b2477fcda549099443a960c70adb079a2b63d29d18c40cf939deeffeef"} Apr 21 17:45:53.938611 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:53.938328 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:45:53.967948 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:45:53.967894 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" podStartSLOduration=7.75599155 podStartE2EDuration="7.967879994s" podCreationTimestamp="2026-04-21 17:45:46 +0000 UTC" firstStartedPulling="2026-04-21 17:45:52.934044648 +0000 UTC m=+740.074107392" lastFinishedPulling="2026-04-21 17:45:53.145933091 +0000 UTC m=+740.285995836" observedRunningTime="2026-04-21 17:45:53.965334937 +0000 UTC m=+741.105397704" watchObservedRunningTime="2026-04-21 17:45:53.967879994 +0000 UTC m=+741.107942775" Apr 21 17:46:04.953932 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:46:04.953901 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb" Apr 21 17:46:36.826504 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:46:36.826463 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:46:45.932532 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:46:45.932451 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:47:32.323349 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:47:32.323311 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:47:43.224498 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:47:43.224458 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:47:52.128812 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:47:52.128777 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:48:02.730521 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:48:02.730482 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:48:11.922828 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:48:11.922785 2583 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:48:22.024698 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:48:22.024611 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:49:23.628437 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:49:23.628395 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:49:38.829060 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:49:38.829020 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:50:17.733088 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:50:17.733001 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:50:33.730064 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:50:33.730031 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:50:48.422658 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:50:48.422621 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:51:05.625695 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:51:05.625663 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:51:57.325045 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:51:57.325008 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:52:06.627783 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:52:06.627744 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:52:23.218334 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:52:23.218299 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:52:31.823670 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:52:31.823632 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:52:48.720782 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:52:48.720698 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:52:57.124257 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:52:57.124214 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:53:29.523318 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:53:29.523286 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:53:37.418580 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:53:37.418542 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:53:45.929082 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:53:45.929040 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:53:54.419902 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:53:54.419858 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:54:02.817485 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:54:02.817447 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:54:19.919112 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:54:19.919072 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:54:30.929190 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:54:30.929148 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:55:17.421965 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:55:17.421926 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:55:25.723657 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:55:25.723618 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:55:34.830521 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:55:34.830486 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:55:43.930223 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:55:43.930116 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:55:53.618140 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:55:53.618102 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:56:01.826450 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:56:01.826413 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:56:10.521522 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:56:10.521478 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:56:15.414593 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:56:15.414545 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:56:19.330166 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:56:19.330131 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:56:27.728112 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:56:27.728070 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:56:36.422266 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:56:36.422216 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:56:45.326880 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:56:45.326847 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:56:54.128948 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:56:54.128913 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:57:03.124601 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:57:03.124563 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:57:11.230271 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:57:11.230234 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:57:20.325109 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:57:20.325026 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:57:28.419269 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:57:28.419227 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:57:37.418776 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:57:37.418743 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:57:45.630037 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:57:45.630000 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 17:58:39.568513 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:39.568478 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"] Apr 21 17:58:39.569025 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:39.568734 2583 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt" podUID="233b2076-a625-45a7-b796-74a94445bca7" containerName="manager" containerID="cri-o://8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a" gracePeriod=10 Apr 21 17:58:39.809067 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:39.809039 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt" Apr 21 17:58:39.902408 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:39.902320 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/233b2076-a625-45a7-b796-74a94445bca7-extensions-socket-volume\") pod \"233b2076-a625-45a7-b796-74a94445bca7\" (UID: \"233b2076-a625-45a7-b796-74a94445bca7\") " Apr 21 17:58:39.902408 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:39.902382 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfv7m\" (UniqueName: \"kubernetes.io/projected/233b2076-a625-45a7-b796-74a94445bca7-kube-api-access-hfv7m\") pod \"233b2076-a625-45a7-b796-74a94445bca7\" (UID: \"233b2076-a625-45a7-b796-74a94445bca7\") " Apr 21 17:58:39.902703 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:39.902676 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/233b2076-a625-45a7-b796-74a94445bca7-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "233b2076-a625-45a7-b796-74a94445bca7" (UID: "233b2076-a625-45a7-b796-74a94445bca7"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:58:39.904645 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:39.904626 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233b2076-a625-45a7-b796-74a94445bca7-kube-api-access-hfv7m" (OuterVolumeSpecName: "kube-api-access-hfv7m") pod "233b2076-a625-45a7-b796-74a94445bca7" (UID: "233b2076-a625-45a7-b796-74a94445bca7"). InnerVolumeSpecName "kube-api-access-hfv7m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:58:40.003333 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.003288 2583 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/233b2076-a625-45a7-b796-74a94445bca7-extensions-socket-volume\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:58:40.003333 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.003323 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hfv7m\" (UniqueName: \"kubernetes.io/projected/233b2076-a625-45a7-b796-74a94445bca7-kube-api-access-hfv7m\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 17:58:40.680240 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.680202 2583 generic.go:358] "Generic (PLEG): container finished" podID="233b2076-a625-45a7-b796-74a94445bca7" containerID="8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a" exitCode=0 Apr 21 17:58:40.680708 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.680271 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt" Apr 21 17:58:40.680708 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.680289 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt" event={"ID":"233b2076-a625-45a7-b796-74a94445bca7","Type":"ContainerDied","Data":"8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a"} Apr 21 17:58:40.680708 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.680341 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt" event={"ID":"233b2076-a625-45a7-b796-74a94445bca7","Type":"ContainerDied","Data":"1ceb5c91a5842accbb386a2d2f0947b4073376a2189b16997300e846bc7e07cd"} Apr 21 17:58:40.680708 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.680365 2583 scope.go:117] "RemoveContainer" containerID="8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a" Apr 21 17:58:40.689504 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.689484 2583 scope.go:117] "RemoveContainer" containerID="8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a" Apr 21 17:58:40.689755 ip-10-0-129-92 kubenswrapper[2583]: E0421 17:58:40.689734 2583 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a\": container with ID starting with 8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a not found: ID does not exist" containerID="8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a" Apr 21 17:58:40.689847 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.689760 2583 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a"} err="failed to get container status 
\"8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a\": rpc error: code = NotFound desc = could not find container \"8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a\": container with ID starting with 8bf0b3a3c4f2a2f32af898e28c6d3c8dc30092f2bf790d81ddc0ab5e3009092a not found: ID does not exist" Apr 21 17:58:40.702941 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.702917 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"] Apr 21 17:58:40.706508 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:40.706488 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-42rpt"] Apr 21 17:58:41.464910 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:58:41.464876 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233b2076-a625-45a7-b796-74a94445bca7" path="/var/lib/kubelet/pods/233b2076-a625-45a7-b796-74a94445bca7/volumes" Apr 21 17:59:45.623832 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.623796 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs"] Apr 21 17:59:45.624414 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.624360 2583 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="233b2076-a625-45a7-b796-74a94445bca7" containerName="manager" Apr 21 17:59:45.624414 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.624380 2583 state_mem.go:107] "Deleted CPUSet assignment" podUID="233b2076-a625-45a7-b796-74a94445bca7" containerName="manager" Apr 21 17:59:45.624528 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.624502 2583 memory_manager.go:356] "RemoveStaleState removing state" podUID="233b2076-a625-45a7-b796-74a94445bca7" containerName="manager" Apr 21 17:59:45.627936 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.627915 2583 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 17:59:45.630833 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.630811 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-jn8hf\"" Apr 21 17:59:45.638148 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.638124 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs"] Apr 21 17:59:45.711906 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.711870 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r9dh\" (UniqueName: \"kubernetes.io/projected/28e201ea-4352-4b3f-9c48-07420cbab827-kube-api-access-7r9dh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zznzs\" (UID: \"28e201ea-4352-4b3f-9c48-07420cbab827\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 17:59:45.712086 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.711996 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/28e201ea-4352-4b3f-9c48-07420cbab827-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zznzs\" (UID: \"28e201ea-4352-4b3f-9c48-07420cbab827\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 17:59:45.812719 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.812678 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/28e201ea-4352-4b3f-9c48-07420cbab827-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zznzs\" (UID: \"28e201ea-4352-4b3f-9c48-07420cbab827\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 17:59:45.812882 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.812728 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r9dh\" (UniqueName: \"kubernetes.io/projected/28e201ea-4352-4b3f-9c48-07420cbab827-kube-api-access-7r9dh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zznzs\" (UID: \"28e201ea-4352-4b3f-9c48-07420cbab827\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 17:59:45.813101 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.813082 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/28e201ea-4352-4b3f-9c48-07420cbab827-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zznzs\" (UID: \"28e201ea-4352-4b3f-9c48-07420cbab827\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 17:59:45.821403 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.821365 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r9dh\" (UniqueName: \"kubernetes.io/projected/28e201ea-4352-4b3f-9c48-07420cbab827-kube-api-access-7r9dh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zznzs\" (UID: \"28e201ea-4352-4b3f-9c48-07420cbab827\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 17:59:45.939625 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:45.939519 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 17:59:46.092904 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:46.092868 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs"] Apr 21 17:59:46.094449 ip-10-0-129-92 kubenswrapper[2583]: W0421 17:59:46.094422 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28e201ea_4352_4b3f_9c48_07420cbab827.slice/crio-28eca449a72330c33444f7e94078c5ac0f352629b8e82cc15d229ffafd13eed4 WatchSource:0}: Error finding container 28eca449a72330c33444f7e94078c5ac0f352629b8e82cc15d229ffafd13eed4: Status 404 returned error can't find the container with id 28eca449a72330c33444f7e94078c5ac0f352629b8e82cc15d229ffafd13eed4 Apr 21 17:59:46.096697 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:46.096681 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 17:59:46.903376 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:46.903342 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" event={"ID":"28e201ea-4352-4b3f-9c48-07420cbab827","Type":"ContainerStarted","Data":"7bc12a5534865a7067105a8a64d832edc2aa8e7f3b57014a531b77504c40212b"} Apr 21 17:59:46.903376 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:46.903377 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" event={"ID":"28e201ea-4352-4b3f-9c48-07420cbab827","Type":"ContainerStarted","Data":"28eca449a72330c33444f7e94078c5ac0f352629b8e82cc15d229ffafd13eed4"} Apr 21 17:59:46.903822 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:46.903399 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 17:59:46.928380 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:46.928328 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" podStartSLOduration=1.92831317 podStartE2EDuration="1.92831317s" podCreationTimestamp="2026-04-21 17:59:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:59:46.925929774 +0000 UTC m=+1574.065992541" watchObservedRunningTime="2026-04-21 17:59:46.92831317 +0000 UTC m=+1574.068375937" Apr 21 17:59:57.909621 ip-10-0-129-92 kubenswrapper[2583]: I0421 17:59:57.909589 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zznzs" Apr 21 18:00:00.135161 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.135125 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29613240-kzjx8"] Apr 21 18:00:00.138654 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.138635 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" Apr 21 18:00:00.141530 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.141506 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-fq944\"" Apr 21 18:00:00.143152 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.143128 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc788\" (UniqueName: \"kubernetes.io/projected/4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd-kube-api-access-kc788\") pod \"maas-api-key-cleanup-29613240-kzjx8\" (UID: \"4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd\") " pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" Apr 21 18:00:00.150789 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.150761 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613240-kzjx8"] Apr 21 18:00:00.243713 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.243672 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kc788\" (UniqueName: \"kubernetes.io/projected/4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd-kube-api-access-kc788\") pod \"maas-api-key-cleanup-29613240-kzjx8\" (UID: \"4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd\") " pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" Apr 21 18:00:00.255562 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.255536 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc788\" (UniqueName: \"kubernetes.io/projected/4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd-kube-api-access-kc788\") pod \"maas-api-key-cleanup-29613240-kzjx8\" (UID: \"4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd\") " pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" Apr 21 18:00:00.449340 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.449250 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" Apr 21 18:00:00.578831 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.578791 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613240-kzjx8"] Apr 21 18:00:00.581355 ip-10-0-129-92 kubenswrapper[2583]: W0421 18:00:00.581327 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d2e26bd_065a_44f9_b2ef_7bc8b03dc1fd.slice/crio-da6f123f2d6e939974a6f2e19eb37a30e501a3954c5b03602b450f71ca954804 WatchSource:0}: Error finding container da6f123f2d6e939974a6f2e19eb37a30e501a3954c5b03602b450f71ca954804: Status 404 returned error can't find the container with id da6f123f2d6e939974a6f2e19eb37a30e501a3954c5b03602b450f71ca954804 Apr 21 18:00:00.953226 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:00.953163 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" event={"ID":"4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd","Type":"ContainerStarted","Data":"da6f123f2d6e939974a6f2e19eb37a30e501a3954c5b03602b450f71ca954804"} Apr 21 18:00:06.323338 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:06.323300 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:00:11.229235 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:11.229194 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:00:36.329585 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:36.329546 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:00:42.919434 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:00:42.919393 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:00:53.131477 ip-10-0-129-92 
kubenswrapper[2583]: I0421 18:00:53.131437 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:01:00.010395 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:00.010355 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613240-kzjx8"] Apr 21 18:01:01.631986 ip-10-0-129-92 kubenswrapper[2583]: E0421 18:01:01.631936 2583 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/ubi9/ubi-minimal:9.7: reading manifest sha256:8e5f23f039511fbb19fcd95b7caa0bbfbf1780ed14d300c3539df6f1040e285d in registry.redhat.io/ubi9/ubi-minimal: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="registry.redhat.io/ubi9/ubi-minimal:9.7" Apr 21 18:01:01.632433 ip-10-0-129-92 kubenswrapper[2583]: E0421 18:01:01.632148 2583 kuberuntime_manager.go:1358] "Unhandled Error" err=< Apr 21 18:01:01.632433 ip-10-0-129-92 kubenswrapper[2583]: container &Container{Name:cleanup,Image:registry.redhat.io/ubi9/ubi-minimal:9.7,Command:[/bin/sh -c curl -sf -X POST http://maas-api:8080/internal/v1/api-keys/cleanup Apr 21 18:01:01.632433 ip-10-0-129-92 kubenswrapper[2583]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{33554432 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{16777216 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kc788,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod maas-api-key-cleanup-29613240-kzjx8_opendatahub(4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/ubi9/ubi-minimal:9.7: reading manifest sha256:8e5f23f039511fbb19fcd95b7caa0bbfbf1780ed14d300c3539df6f1040e285d in registry.redhat.io/ubi9/ubi-minimal: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image Apr 21 18:01:01.632433 ip-10-0-129-92 kubenswrapper[2583]: > logger="UnhandledError" Apr 21 18:01:01.633780 ip-10-0-129-92 kubenswrapper[2583]: E0421 18:01:01.633747 2583 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/ubi9/ubi-minimal:9.7: reading manifest sha256:8e5f23f039511fbb19fcd95b7caa0bbfbf1780ed14d300c3539df6f1040e285d in 
registry.redhat.io/ubi9/ubi-minimal: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" podUID="4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd" Apr 21 18:01:02.312121 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:02.312098 2583 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" Apr 21 18:01:02.395307 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:02.395277 2583 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc788\" (UniqueName: \"kubernetes.io/projected/4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd-kube-api-access-kc788\") pod \"4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd\" (UID: \"4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd\") " Apr 21 18:01:02.397548 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:02.397519 2583 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd-kube-api-access-kc788" (OuterVolumeSpecName: "kube-api-access-kc788") pod "4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd" (UID: "4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd"). InnerVolumeSpecName "kube-api-access-kc788". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 18:01:02.496423 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:02.496388 2583 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kc788\" (UniqueName: \"kubernetes.io/projected/4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd-kube-api-access-kc788\") on node \"ip-10-0-129-92.ec2.internal\" DevicePath \"\"" Apr 21 18:01:03.183322 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:03.183296 2583 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" Apr 21 18:01:03.183322 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:03.183289 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613240-kzjx8" event={"ID":"4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd","Type":"ContainerDied","Data":"da6f123f2d6e939974a6f2e19eb37a30e501a3954c5b03602b450f71ca954804"} Apr 21 18:01:03.215916 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:03.215883 2583 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613240-kzjx8"] Apr 21 18:01:03.220774 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:03.220738 2583 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613240-kzjx8"] Apr 21 18:01:03.464917 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:03.464888 2583 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd" path="/var/lib/kubelet/pods/4d2e26bd-065a-44f9-b2ef-7bc8b03dc1fd/volumes" Apr 21 18:01:03.818265 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:03.818151 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:01:11.422652 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:11.422613 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:01:22.726311 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:22.726273 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:01:31.524855 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:31.524816 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:01:42.028649 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:42.028610 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:01:51.122045 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:01:51.121955 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:02:02.532274 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:02:02.532236 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:02:10.425305 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:02:10.425258 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:02:16.126933 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:02:16.126890 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:02:44.060192 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:02:44.060148 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:03:25.827417 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:03:25.822702 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:03:34.617141 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:03:34.617096 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:03:43.329415 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:03:43.329368 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:03:53.922595 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:03:53.922557 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:04:03.124799 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:04:03.124759 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:04:15.626545 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:04:15.626507 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:04:24.729507 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:04:24.729470 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:04:32.824602 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:04:32.824560 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:04:40.327347 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:04:40.327311 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:04:48.721401 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:04:48.721291 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:04:57.418772 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:04:57.418737 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:05:08.132846 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:05:08.132801 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:05:25.924564 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:05:25.924521 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:05:33.629913 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:05:33.629872 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:05:43.029118 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:05:43.029073 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:05:51.423538 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:05:51.423500 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:06:08.724937 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:06:08.724899 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:06:15.836012 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:06:15.835921 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:06:24.828209 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:06:24.828155 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:06:33.416532 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:06:33.416491 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:06:42.723610 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:06:42.723572 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:06:50.746080 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:06:50.746040 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:07:00.039624 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:07:00.039581 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:07:13.325991 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:07:13.325958 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:07:22.332570 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:07:22.332531 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:07:35.424382 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:07:35.424342 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:07:44.423737 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:07:44.423646 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:07:52.545712 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:07:52.545672 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:08:00.631295 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:08:00.631258 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:08:08.616347 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:08:08.616309 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:08:24.828036 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:08:24.827997 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:08:33.423143 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:08:33.423100 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:08:42.326808 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:08:42.326774 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:08:50.624088 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:08:50.624043 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:09:15.323384 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:15.323285 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:09:27.016369 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:27.016325 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-5h6gp"] Apr 21 18:09:33.366304 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:33.366270 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g_f61381e7-9639-4b7b-8596-4906556a0b03/manager/0.log" Apr 21 18:09:34.943446 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:34.943409 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-z4tzz_e726105f-d76d-4f78-8162-228534307ab2/manager/0.log" Apr 21 18:09:35.050031 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:35.049995 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-h9sv8_fe34329f-98ea-4289-bbb4-f6052a0a01f5/manager/0.log" Apr 21 18:09:35.284605 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:35.284519 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-nc9q5_f349e837-52d4-4138-b150-4f2f90ca1ec8/registry-server/0.log" Apr 21 18:09:35.411367 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:35.411331 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-zznzs_28e201ea-4352-4b3f-9c48-07420cbab827/manager/0.log" Apr 21 18:09:35.525932 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:35.525895 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-5h6gp_a4d82be3-2931-4deb-934c-487fe167e261/limitador/0.log" Apr 21 18:09:35.975747 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:35.975712 2583 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cffwd69_16cc9871-1c08-4e35-968e-3d455ccf671e/istio-proxy/0.log" Apr 21 18:09:36.546928 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:36.546896 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-bcf746f86-2s4sn_3a283e8b-7dfc-4c49-9afd-4adb1c192587/router/0.log" Apr 21 18:09:36.875827 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:36.875747 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb_9ecb44b0-0eb4-473c-9289-ffa676e85f54/storage-initializer/0.log" Apr 21 18:09:36.883729 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:36.883701 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-whwwb_9ecb44b0-0eb4-473c-9289-ffa676e85f54/main/0.log" Apr 21 18:09:36.992504 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:36.992466 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s_c8bcaf2f-5723-46b2-b6e7-42705b9b4d01/storage-initializer/0.log" Apr 21 18:09:36.999974 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:36.999948 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-fgh2s_c8bcaf2f-5723-46b2-b6e7-42705b9b4d01/main/0.log" Apr 21 18:09:44.003526 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:44.003494 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-94ktg_b0a2f124-319a-473e-9b27-5c36c13da638/global-pull-secret-syncer/0.log" Apr 21 18:09:44.127287 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:44.127251 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-sjdgx_45e8c620-ac92-4664-985c-5abe0fc26bed/konnectivity-agent/0.log" Apr 21 18:09:44.187329 ip-10-0-129-92 kubenswrapper[2583]: I0421 
18:09:44.187298 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-92.ec2.internal_666dcd2066b38a4dbb7941535c2eb7f9/haproxy/0.log" Apr 21 18:09:48.161451 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:48.161412 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-z4tzz_e726105f-d76d-4f78-8162-228534307ab2/manager/0.log" Apr 21 18:09:48.180253 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:48.180224 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-h9sv8_fe34329f-98ea-4289-bbb4-f6052a0a01f5/manager/0.log" Apr 21 18:09:48.251476 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:48.251432 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-nc9q5_f349e837-52d4-4138-b150-4f2f90ca1ec8/registry-server/0.log" Apr 21 18:09:48.338582 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:48.338554 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-zznzs_28e201ea-4352-4b3f-9c48-07420cbab827/manager/0.log" Apr 21 18:09:48.356301 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:48.356269 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-5h6gp_a4d82be3-2931-4deb-934c-487fe167e261/limitador/0.log" Apr 21 18:09:49.773660 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:49.773606 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4f8e89a-b68a-4807-9aca-4424ccedd246/alertmanager/0.log" Apr 21 18:09:49.791451 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:49.791423 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4f8e89a-b68a-4807-9aca-4424ccedd246/config-reloader/0.log" Apr 21 18:09:49.816234 ip-10-0-129-92 
kubenswrapper[2583]: I0421 18:09:49.816144 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4f8e89a-b68a-4807-9aca-4424ccedd246/kube-rbac-proxy-web/0.log" Apr 21 18:09:49.833869 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:49.833844 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4f8e89a-b68a-4807-9aca-4424ccedd246/kube-rbac-proxy/0.log" Apr 21 18:09:49.855143 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:49.855114 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4f8e89a-b68a-4807-9aca-4424ccedd246/kube-rbac-proxy-metric/0.log" Apr 21 18:09:49.872542 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:49.872512 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4f8e89a-b68a-4807-9aca-4424ccedd246/prom-label-proxy/0.log" Apr 21 18:09:49.893526 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:49.893492 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e4f8e89a-b68a-4807-9aca-4424ccedd246/init-config-reloader/0.log" Apr 21 18:09:50.020577 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.020548 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-58f9c865df-m8vrm_d73ac7ba-d87d-428c-a3df-8217c7ce9412/metrics-server/0.log" Apr 21 18:09:50.162262 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.162187 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ns9q7_edb54a22-e9b8-49a2-b5f3-e4273f5b4400/node-exporter/0.log" Apr 21 18:09:50.180770 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.180742 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ns9q7_edb54a22-e9b8-49a2-b5f3-e4273f5b4400/kube-rbac-proxy/0.log" Apr 21 18:09:50.203848 ip-10-0-129-92 
kubenswrapper[2583]: I0421 18:09:50.203825 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ns9q7_edb54a22-e9b8-49a2-b5f3-e4273f5b4400/init-textfile/0.log" Apr 21 18:09:50.310458 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.310426 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hmmfp_b27107b2-5afd-4ffb-9d8b-7d03e1202c52/kube-rbac-proxy-main/0.log" Apr 21 18:09:50.334322 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.334292 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hmmfp_b27107b2-5afd-4ffb-9d8b-7d03e1202c52/kube-rbac-proxy-self/0.log" Apr 21 18:09:50.353245 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.353212 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-hmmfp_b27107b2-5afd-4ffb-9d8b-7d03e1202c52/openshift-state-metrics/0.log" Apr 21 18:09:50.405614 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.405572 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_485192ea-97bf-4e77-a113-a90abb8a1ff2/prometheus/0.log" Apr 21 18:09:50.430306 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.430274 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_485192ea-97bf-4e77-a113-a90abb8a1ff2/config-reloader/0.log" Apr 21 18:09:50.454598 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.454568 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_485192ea-97bf-4e77-a113-a90abb8a1ff2/thanos-sidecar/0.log" Apr 21 18:09:50.474148 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.474123 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_485192ea-97bf-4e77-a113-a90abb8a1ff2/kube-rbac-proxy-web/0.log" Apr 21 
18:09:50.491031 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.491003 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_485192ea-97bf-4e77-a113-a90abb8a1ff2/kube-rbac-proxy/0.log" Apr 21 18:09:50.507138 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.507101 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_485192ea-97bf-4e77-a113-a90abb8a1ff2/kube-rbac-proxy-thanos/0.log" Apr 21 18:09:50.524188 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.524145 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_485192ea-97bf-4e77-a113-a90abb8a1ff2/init-config-reloader/0.log" Apr 21 18:09:50.604443 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:50.604402 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-nzxcj_82faf712-4dd3-4d34-8d74-ef5e8b3aa92c/prometheus-operator-admission-webhook/0.log" Apr 21 18:09:53.172031 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.171992 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-rf84h_b381afa1-d1a3-4590-97dd-05ddf6b9551c/download-server/0.log" Apr 21 18:09:53.455324 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.455292 2583 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59"] Apr 21 18:09:53.458780 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.458760 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.461276 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.461255 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nf5lv\"/\"kube-root-ca.crt\"" Apr 21 18:09:53.462418 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.462394 2583 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nf5lv\"/\"default-dockercfg-28xmb\"" Apr 21 18:09:53.462577 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.462553 2583 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nf5lv\"/\"openshift-service-ca.crt\"" Apr 21 18:09:53.466536 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.466513 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59"] Apr 21 18:09:53.575434 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.575398 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-lib-modules\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.575616 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.575458 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-proc\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.575616 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.575522 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-podres\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.575709 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.575630 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-sys\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.575709 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.575670 2583 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5krq\" (UniqueName: \"kubernetes.io/projected/af63e195-21f4-4fc3-ae8a-481c664a6ee9-kube-api-access-m5krq\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.677143 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.677104 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-lib-modules\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.677143 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.677154 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-proc\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " 
pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.677472 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.677215 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-podres\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.677472 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.677255 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-proc\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.677472 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.677267 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-sys\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.677472 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.677325 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-sys\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.677472 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.677327 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-lib-modules\") pod 
\"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.677472 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.677329 2583 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5krq\" (UniqueName: \"kubernetes.io/projected/af63e195-21f4-4fc3-ae8a-481c664a6ee9-kube-api-access-m5krq\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.677472 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.677381 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/af63e195-21f4-4fc3-ae8a-481c664a6ee9-podres\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.685126 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.685103 2583 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5krq\" (UniqueName: \"kubernetes.io/projected/af63e195-21f4-4fc3-ae8a-481c664a6ee9-kube-api-access-m5krq\") pod \"perf-node-gather-daemonset-vsl59\" (UID: \"af63e195-21f4-4fc3-ae8a-481c664a6ee9\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" Apr 21 18:09:53.770784 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.770692 2583 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59"
Apr 21 18:09:53.901184 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.901142 2583 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59"]
Apr 21 18:09:53.903946 ip-10-0-129-92 kubenswrapper[2583]: W0421 18:09:53.903917 2583 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaf63e195_21f4_4fc3_ae8a_481c664a6ee9.slice/crio-3f736b46e6b4dafd8237ab95f5fd688669dfcdab5b54bc1d50ec8920f5683b02 WatchSource:0}: Error finding container 3f736b46e6b4dafd8237ab95f5fd688669dfcdab5b54bc1d50ec8920f5683b02: Status 404 returned error can't find the container with id 3f736b46e6b4dafd8237ab95f5fd688669dfcdab5b54bc1d50ec8920f5683b02
Apr 21 18:09:53.905575 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:53.905550 2583 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 18:09:54.093232 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:54.093121 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" event={"ID":"af63e195-21f4-4fc3-ae8a-481c664a6ee9","Type":"ContainerStarted","Data":"c13abdd5ff6f7904dd6225bb0f433eac6eb1dd7c31c3df63eb24288f10104287"}
Apr 21 18:09:54.093232 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:54.093165 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59"
Apr 21 18:09:54.093232 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:54.093195 2583 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" event={"ID":"af63e195-21f4-4fc3-ae8a-481c664a6ee9","Type":"ContainerStarted","Data":"3f736b46e6b4dafd8237ab95f5fd688669dfcdab5b54bc1d50ec8920f5683b02"}
Apr 21 18:09:54.110871 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:54.110824 2583 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59" podStartSLOduration=1.110808125 podStartE2EDuration="1.110808125s" podCreationTimestamp="2026-04-21 18:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 18:09:54.109709158 +0000 UTC m=+2181.249771924" watchObservedRunningTime="2026-04-21 18:09:54.110808125 +0000 UTC m=+2181.250870916"
Apr 21 18:09:54.621107 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:54.621067 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tk5wc_c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124/dns/0.log"
Apr 21 18:09:54.640714 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:54.640680 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tk5wc_c3cd5d3c-ce71-40e7-aaf4-cef9eac5f124/kube-rbac-proxy/0.log"
Apr 21 18:09:54.685680 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:54.685635 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hccth_16529597-1f2f-47de-ade5-9fb7b122147c/dns-node-resolver/0.log"
Apr 21 18:09:55.223630 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:55.223595 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-86bffdb85-f9pnq_ec7cf3c5-c440-4bc6-a03d-076cc402b14a/registry/0.log"
Apr 21 18:09:55.264036 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:55.264001 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hmlbc_d5d8e485-86a1-4255-912b-af222842087c/node-ca/0.log"
Apr 21 18:09:56.061903 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:56.061862 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cffwd69_16cc9871-1c08-4e35-968e-3d455ccf671e/istio-proxy/0.log"
Apr 21 18:09:56.355669 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:56.355561 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-bcf746f86-2s4sn_3a283e8b-7dfc-4c49-9afd-4adb1c192587/router/0.log"
Apr 21 18:09:56.877469 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:56.877432 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4cg4j_53e51873-1fc7-47fe-b3f9-224f9c0521d4/serve-healthcheck-canary/0.log"
Apr 21 18:09:57.479820 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:57.479785 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dllbm_c73551f6-0e85-4d52-beca-b016ec94ad43/kube-rbac-proxy/0.log"
Apr 21 18:09:57.495638 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:57.495613 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dllbm_c73551f6-0e85-4d52-beca-b016ec94ad43/exporter/0.log"
Apr 21 18:09:57.512387 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:57.512347 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dllbm_c73551f6-0e85-4d52-beca-b016ec94ad43/extractor/0.log"
Apr 21 18:09:59.734544 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:09:59.734509 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d5f5c78f5-5hw9g_f61381e7-9639-4b7b-8596-4906556a0b03/manager/0.log"
Apr 21 18:10:00.106688 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:00.106620 2583 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-vsl59"
Apr 21 18:10:00.924087 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:00.924053 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-bdd4f6877-q58td_6eec124d-d377-4906-b479-77f34f5fe45a/manager/0.log"
Apr 21 18:10:05.713335 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:05.713273 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-h7cx4_343fcef3-240d-459d-8c84-7164f0722f10/kube-storage-version-migrator-operator/1.log"
Apr 21 18:10:05.715315 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:05.715286 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-h7cx4_343fcef3-240d-459d-8c84-7164f0722f10/kube-storage-version-migrator-operator/0.log"
Apr 21 18:10:06.800614 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:06.800584 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4qt27_47035621-4957-4280-94ce-ecd6810f7254/kube-multus-additional-cni-plugins/0.log"
Apr 21 18:10:06.817535 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:06.817506 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4qt27_47035621-4957-4280-94ce-ecd6810f7254/egress-router-binary-copy/0.log"
Apr 21 18:10:06.833668 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:06.833633 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4qt27_47035621-4957-4280-94ce-ecd6810f7254/cni-plugins/0.log"
Apr 21 18:10:06.857502 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:06.857474 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4qt27_47035621-4957-4280-94ce-ecd6810f7254/bond-cni-plugin/0.log"
Apr 21 18:10:06.874056 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:06.874022 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4qt27_47035621-4957-4280-94ce-ecd6810f7254/routeoverride-cni/0.log"
Apr 21 18:10:06.892281 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:06.892250 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4qt27_47035621-4957-4280-94ce-ecd6810f7254/whereabouts-cni-bincopy/0.log"
Apr 21 18:10:06.909056 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:06.909028 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4qt27_47035621-4957-4280-94ce-ecd6810f7254/whereabouts-cni/0.log"
Apr 21 18:10:07.209032 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:07.209002 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sgc5c_edc1db03-a462-4f21-bb36-369766777418/kube-multus/0.log"
Apr 21 18:10:07.256519 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:07.256490 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rfmv6_38cd15ba-d0c7-4b4f-b220-f72981ccd9da/network-metrics-daemon/0.log"
Apr 21 18:10:07.271812 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:07.271781 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rfmv6_38cd15ba-d0c7-4b4f-b220-f72981ccd9da/kube-rbac-proxy/0.log"
Apr 21 18:10:08.712198 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:08.712141 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfgp5_14778c8a-9e3f-4e53-aea1-4de908a64e9f/ovn-controller/0.log"
Apr 21 18:10:08.745768 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:08.745736 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfgp5_14778c8a-9e3f-4e53-aea1-4de908a64e9f/ovn-acl-logging/0.log"
Apr 21 18:10:08.765243 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:08.765214 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfgp5_14778c8a-9e3f-4e53-aea1-4de908a64e9f/kube-rbac-proxy-node/0.log"
Apr 21 18:10:08.786513 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:08.786468 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfgp5_14778c8a-9e3f-4e53-aea1-4de908a64e9f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 18:10:08.800061 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:08.800032 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfgp5_14778c8a-9e3f-4e53-aea1-4de908a64e9f/northd/0.log"
Apr 21 18:10:08.816019 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:08.815987 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfgp5_14778c8a-9e3f-4e53-aea1-4de908a64e9f/nbdb/0.log"
Apr 21 18:10:08.832188 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:08.832143 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfgp5_14778c8a-9e3f-4e53-aea1-4de908a64e9f/sbdb/0.log"
Apr 21 18:10:09.003756 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:09.003668 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xfgp5_14778c8a-9e3f-4e53-aea1-4de908a64e9f/ovnkube-controller/0.log"
Apr 21 18:10:10.115557 ip-10-0-129-92 kubenswrapper[2583]: I0421 18:10:10.115529 2583 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5bfpn_fbb6a7fe-cc60-43c1-919d-78f0d38148cd/network-check-target-container/0.log"