Apr 22 18:40:49.569748 ip-10-0-134-244 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:40:49.569761 ip-10-0-134-244 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:40:49.569770 ip-10-0-134-244 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:40:49.570098 ip-10-0-134-244 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:40:59.571007 ip-10-0-134-244 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:40:59.571043 ip-10-0-134-244 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d2fcc8ae6a364720ae0455235cd8d1f1 --
Apr 22 18:43:28.909913 ip-10-0-134-244 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:43:29.281753 ip-10-0-134-244 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:43:29.281753 ip-10-0-134-244 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:43:29.281753 ip-10-0-134-244 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:43:29.281753 ip-10-0-134-244 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:43:29.281753 ip-10-0-134-244 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:43:29.282563 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.282475 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:43:29.285344 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285329 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:29.285344 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285345 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285349 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285352 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285355 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285357 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285360 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285364 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285367 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285370 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285372 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285375 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285377 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285380 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285382 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285385 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285389 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285393 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285396 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285399 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:29.285409 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285401 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285411 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285416 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285419 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285422 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285425 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285428 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285431 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285433 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285436 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285439 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285442 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285444 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285447 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285449 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285453 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285455 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285458 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285460 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285463 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:29.285990 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285465 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285468 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285470 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285473 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285476 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285478 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285481 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285484 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285487 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285489 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285492 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285494 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285497 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285500 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285503 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285506 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285509 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285511 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285514 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285517 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:29.286466 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285519 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285522 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285524 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285527 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285529 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285532 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285534 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285537 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285539 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285549 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285552 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285555 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285558 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285561 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285563 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285567 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285569 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285572 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285575 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:29.286972 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285579 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285582 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285585 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285587 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285589 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285592 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.285595 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286118 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286125 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286128 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286131 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286134 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286137 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286139 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286142 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286145 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286147 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286150 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286152 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286155 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:29.287424 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286157 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286160 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286162 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286165 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286168 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286171 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286173 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286176 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286179 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286181 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286184 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286187 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286189 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286194 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286198 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286203 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286206 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286208 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286211 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:29.287953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286214 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286217 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286219 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286222 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286224 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286227 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286230 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286232 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286234 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286237 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286239 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286243 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286245 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286247 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286250 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286252 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286255 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286257 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286259 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286262 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:29.288423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286264 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286267 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286270 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286272 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286275 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286278 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286280 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286283 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286285 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286288 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286290 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286293 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286296 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286298 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286301 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286303 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286306 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286308 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286311 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:29.288931 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286313 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286316 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286318 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286321 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286324 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286326 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286329 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286332 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286334 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286337 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286339 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286341 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286344 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286346 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.286348 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287508 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287523 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287530 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287535 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287540 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287543 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:43:29.289394 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287548 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287552 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287556 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287559 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287562 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287566 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287569 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287572 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287575 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287578 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287581 2570 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287583 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287586 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287590 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287593 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287596 2570 flags.go:64] FLAG: --config-dir=""
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287599 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287602 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287606 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287609 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287612 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287626 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287630 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287633 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:43:29.289920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287636 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287639 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287642 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287646 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287649 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287654 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287657 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287660 2570 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287663 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287668 2570 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287671 2570 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287674 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287677 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287680 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287684 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287686 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287689 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287692 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:43:29.290487 ip-10-0-134-244
kubenswrapper[2570]: I0422 18:43:29.287695 2570 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287698 2570 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287701 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287704 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287707 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287709 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287712 2570 flags.go:64] FLAG: --feature-gates="" Apr 22 18:43:29.290487 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287716 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287719 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287722 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287725 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287728 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287731 2570 flags.go:64] FLAG: --help="false" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287734 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-134-244.ec2.internal" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287737 2570 flags.go:64] FLAG: 
--housekeeping-interval="10s" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287740 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287743 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287747 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287750 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287754 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287757 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287760 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287763 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287766 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287769 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287772 2570 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287774 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287777 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:43:29.291097 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:43:29.287780 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287783 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287786 2570 flags.go:64] FLAG: --lock-file="" Apr 22 18:43:29.291097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287788 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287791 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287795 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287800 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287803 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287806 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287808 2570 flags.go:64] FLAG: --logging-format="text" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287811 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287814 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287817 2570 flags.go:64] FLAG: --manifest-url="" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287820 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287825 2570 flags.go:64] FLAG: 
--max-housekeeping-interval="15s" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287828 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287832 2570 flags.go:64] FLAG: --max-pods="110" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287835 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287838 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287841 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287844 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287846 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287849 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287852 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287860 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287863 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287866 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:43:29.291668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287869 2570 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287872 2570 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287877 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287880 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287883 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287886 2570 flags.go:64] FLAG: --port="10250" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287889 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287892 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-05bd581a642a93001" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287895 2570 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287898 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287901 2570 flags.go:64] FLAG: --register-node="true" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287904 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287906 2570 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287910 2570 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287913 2570 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287916 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 22 
18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287918 2570 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287922 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287925 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287928 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287931 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287934 2570 flags.go:64] FLAG: --runonce="false" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287936 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287939 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287942 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:43:29.292248 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287945 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287948 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287951 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287954 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287957 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 
18:43:29.287961 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287964 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287967 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287970 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287973 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287976 2570 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287979 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287984 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287987 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287989 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287993 2570 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287996 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.287999 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.288001 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.288004 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 
18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.288007 2570 flags.go:64] FLAG: --v="2" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.288012 2570 flags.go:64] FLAG: --version="false" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.288015 2570 flags.go:64] FLAG: --vmodule="" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.288020 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.288023 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:43:29.292903 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288122 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288126 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288129 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288131 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288134 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288137 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288139 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288142 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288145 2570 
feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288147 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288150 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288156 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288158 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288161 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288164 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288166 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288169 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288171 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288174 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288177 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:43:29.293511 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288179 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 
18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288182 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288184 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288187 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288189 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288192 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288194 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288197 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288199 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288202 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288204 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288207 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288213 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288216 2570 feature_gate.go:328] unrecognized 
feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288219 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288222 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288224 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288227 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288229 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288231 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:43:29.294116 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288234 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288236 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288240 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288244 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288246 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288249 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:43:29.295017 ip-10-0-134-244 
kubenswrapper[2570]: W0422 18:43:29.288252 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288255 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288257 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288260 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288264 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288267 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288271 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288273 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288276 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288279 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288281 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288284 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288286 2570 feature_gate.go:328] 
unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:43:29.295017 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288289 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288291 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288294 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288296 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288299 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288302 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288305 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288307 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288310 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288312 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288315 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288318 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:43:29.295850 
ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288320 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288323 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288325 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288328 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288331 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288335 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288338 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288341 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:29.295850 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288344 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288347 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288349 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288352 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288355 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288357 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.288360 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.288955 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.296262 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.296282 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296358 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296366 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296371 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296376 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296381 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296385 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:29.296713 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296389 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296393 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296397 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296402 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296406 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296410 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296415 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296419 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296423 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296428 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296431 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296436 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296440 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296445 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296450 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296454 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296458 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296462 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296466 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296471 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:29.297423 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296474 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296478 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296482 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296487 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296491 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296498 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296502 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296507 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296512 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296516 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296520 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296524 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296529 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296534 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296538 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296544 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296550 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296554 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296558 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:29.298029 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296562 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296567 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296571 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296576 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296580 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296585 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296589 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296593 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296597 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296601 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296605 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296609 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296613 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296638 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296643 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296647 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296652 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296656 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296661 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296666 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:29.298599 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296670 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296674 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296678 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296682 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296686 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296691 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296695 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296699 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296703 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296709 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296716 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296721 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296724 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296729 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296733 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296737 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296742 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296746 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296751 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:29.299326 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296755 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296759 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.296768 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296928 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296937 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296941 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296945 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296952 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296958 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296963 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296967 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296972 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296976 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296981 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296985 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:43:29.299866 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296988 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296992 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.296996 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297000 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297004 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297008 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297012 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297015 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297019 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297023 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297027 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297031 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297035 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297039 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297043 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297047 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297051 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297055 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297059 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297063 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:43:29.300303 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297067 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297071 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297076 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297080 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297084 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297088 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297092 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297096 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297100 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297104 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297108 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297120 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297124 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297128 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297132 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297136 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297140 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297144 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297149 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:43:29.301000 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297153 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297157 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297161 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297165 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297169 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297175 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297180 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297186 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297191 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297195 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297200 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297204 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297208 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297212 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297216 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297220 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297224 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297228 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297232 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:43:29.301922 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297236 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297240 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297244 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297248 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297252 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297256 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297262 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297266 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297271 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297275 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297279 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297283 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297287 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297291 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297295 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:43:29.302487 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:29.297299 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:43:29.302956 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.297307 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:43:29.302956 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.298140 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:43:29.302956 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.302909 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:43:29.303849 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.303834 2570 server.go:1019] "Starting client certificate rotation"
Apr 22 18:43:29.303960 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.303940 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:43:29.303995 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.303980 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:43:29.324350 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.324314 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:43:29.328295 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.328265 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:43:29.341439 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.341411 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:43:29.346670 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.346647 2570 log.go:25] "Validated CRI v1 image API"
Apr 22 18:43:29.348396 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.348378 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:43:29.352175 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.352153 2570 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a4a743f3-b024-4013-a0fb-8668f67ddc71:/dev/nvme0n1p3 f847cc56-a678-4659-bf63-8ebe039a02cf:/dev/nvme0n1p4]
Apr 22 18:43:29.352243 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.352174 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:43:29.353290 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.353276 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:43:29.357875 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.357757 2570 manager.go:217] Machine: {Timestamp:2026-04-22 18:43:29.356004269 +0000 UTC m=+0.339470202 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099404 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c2513bb48af033721b12a58139cae SystemUUID:ec2c2513-bb48-af03-3721-b12a58139cae BootID:d2fcc8ae-6a36-4720-ae04-55235cd8d1f1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bd:ff:1f:2d:d3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bd:ff:1f:2d:d3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:3d:02:ae:fa:91 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:43:29.357875 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.357870 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:43:29.358012 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.358000 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:43:29.358917 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.358891 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:43:29.359056 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.358920 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-244.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:43:29.359108 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.359066 2570 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:43:29.359108 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.359074 2570 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:43:29.359108 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.359087 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:43:29.359684 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.359673 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:43:29.360884 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.360873 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:43:29.360999 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.360989 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:43:29.362860 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.362848 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:43:29.362906 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.362871 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:43:29.362906 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.362884 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:43:29.362906 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.362896 2570 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:43:29.363013 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.362908 2570 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 18:43:29.363827 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.363810 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:43:29.363901 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.363838 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:43:29.366970 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.366950 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:43:29.368364 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.368351 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:43:29.369982 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.369965 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:43:29.369982 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.369982 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.369989 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.369994 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.370000 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.370007 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.370012 2570 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.370018 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.370026 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.370032 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.370040 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:43:29.370126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.370049 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:43:29.371336 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.371325 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:43:29.371336 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.371336 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:43:29.372968 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.372941 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:43:29.373484 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.373450 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-244.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:43:29.375657 ip-10-0-134-244 kubenswrapper[2570]: 
I0422 18:43:29.375641 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:43:29.375743 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.375682 2570 server.go:1295] "Started kubelet" Apr 22 18:43:29.375818 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.375787 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:43:29.375869 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.375780 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:43:29.375869 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.375862 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:43:29.375942 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.375933 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-244.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:43:29.376545 ip-10-0-134-244 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:43:29.377037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.377015 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:43:29.378180 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.378165 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:43:29.382695 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.382669 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:43:29.382695 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.382678 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:43:29.383221 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.383199 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:43:29.383221 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.383221 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:43:29.383377 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.383324 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:43:29.383377 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.383370 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:43:29.383377 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.383378 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:43:29.383543 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.383411 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 22 18:43:29.383703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.383669 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qjqxn" Apr 22 18:43:29.384214 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.384192 2570 kubelet.go:1618] "Image garbage 
collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:43:29.384586 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.384574 2570 factory.go:153] Registering CRI-O factory Apr 22 18:43:29.384661 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.384595 2570 factory.go:223] Registration of the crio container factory successfully Apr 22 18:43:29.384661 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.384659 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:43:29.384750 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.384666 2570 factory.go:55] Registering systemd factory Apr 22 18:43:29.384750 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.384671 2570 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:43:29.384750 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.384689 2570 factory.go:103] Registering Raw factory Apr 22 18:43:29.384750 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.384698 2570 manager.go:1196] Started watching for new ooms in manager Apr 22 18:43:29.385137 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.385124 2570 manager.go:319] Starting recovery of all containers Apr 22 18:43:29.387778 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.387537 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-244.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:43:29.388717 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.387933 2570 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-244.ec2.internal.18a8c20be71ef64a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-244.ec2.internal,UID:ip-10-0-134-244.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-244.ec2.internal,},FirstTimestamp:2026-04-22 18:43:29.375655498 +0000 UTC m=+0.359121430,LastTimestamp:2026-04-22 18:43:29.375655498 +0000 UTC m=+0.359121430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-244.ec2.internal,}" Apr 22 18:43:29.390296 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.390252 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:43:29.396147 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.396119 2570 manager.go:324] Recovery completed Apr 22 18:43:29.396357 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.396333 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qjqxn" Apr 22 18:43:29.402173 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.402155 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:29.404533 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.404516 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 22 
18:43:29.404592 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.404550 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:29.404592 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.404564 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:29.405087 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.405072 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:43:29.405087 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.405086 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:43:29.405178 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.405102 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:43:29.406786 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.406715 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-244.ec2.internal.18a8c20be8d79c34 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-244.ec2.internal,UID:ip-10-0-134-244.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-244.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-244.ec2.internal,},FirstTimestamp:2026-04-22 18:43:29.404533812 +0000 UTC m=+0.387999744,LastTimestamp:2026-04-22 18:43:29.404533812 +0000 UTC m=+0.387999744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-244.ec2.internal,}" Apr 22 18:43:29.408343 ip-10-0-134-244 kubenswrapper[2570]: I0422 
18:43:29.408331 2570 policy_none.go:49] "None policy: Start" Apr 22 18:43:29.408400 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.408347 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:43:29.408400 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.408357 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:43:29.452309 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.452290 2570 manager.go:341] "Starting Device Plugin manager" Apr 22 18:43:29.476761 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.452332 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:43:29.476761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.452345 2570 server.go:85] "Starting device plugin registration server" Apr 22 18:43:29.476761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.452644 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:43:29.476761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.452660 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:43:29.476761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.452748 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:43:29.476761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.452832 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:43:29.476761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.452843 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:43:29.476761 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.454140 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 18:43:29.476761 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.454180 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-244.ec2.internal\" not found" Apr 22 18:43:29.543544 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.543447 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:43:29.544812 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.544794 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:43:29.544904 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.544828 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:43:29.544904 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.544857 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:43:29.544904 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.544867 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:43:29.545036 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.544911 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:43:29.547855 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.547836 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:43:29.553432 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.553416 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:29.554197 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.554181 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:29.554274 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.554212 2570 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:29.554274 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.554229 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:29.554274 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.554257 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-244.ec2.internal" Apr 22 18:43:29.564226 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.564210 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-244.ec2.internal" Apr 22 18:43:29.564226 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.564233 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-244.ec2.internal\": node \"ip-10-0-134-244.ec2.internal\" not found" Apr 22 18:43:29.595939 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.595915 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found" Apr 22 18:43:29.645816 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.645780 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal"] Apr 22 18:43:29.645981 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.645884 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:29.646931 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.646912 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:29.647062 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.646950 2570 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:29.647062 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.646964 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:29.649327 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.649309 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:29.649466 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.649437 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" Apr 22 18:43:29.649466 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.649468 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:29.650115 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.650093 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:29.650212 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.650126 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:29.650212 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.650093 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:29.650212 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.650157 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:29.650212 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.650140 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" 
event="NodeHasSufficientPID" Apr 22 18:43:29.650212 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.650173 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:29.652404 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.652390 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" Apr 22 18:43:29.652450 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.652417 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:43:29.653135 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.653120 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:43:29.653214 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.653144 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:43:29.653214 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.653157 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:43:29.683440 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.683417 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-244.ec2.internal\" not found" node="ip-10-0-134-244.ec2.internal" Apr 22 18:43:29.687910 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.687890 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-244.ec2.internal\" not found" node="ip-10-0-134-244.ec2.internal" Apr 22 18:43:29.696944 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.696927 2570 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:29.784878 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.784842 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.784878 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.784880 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.785068 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.784904 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/009f099669c612b1a9a7e8809b1d3526-config\") pod \"kube-apiserver-proxy-ip-10-0-134-244.ec2.internal\" (UID: \"009f099669c612b1a9a7e8809b1d3526\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.797837 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.797762 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:29.885153 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.885121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.885244 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.885163 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/009f099669c612b1a9a7e8809b1d3526-config\") pod \"kube-apiserver-proxy-ip-10-0-134-244.ec2.internal\" (UID: \"009f099669c612b1a9a7e8809b1d3526\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.885244 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.885214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/009f099669c612b1a9a7e8809b1d3526-config\") pod \"kube-apiserver-proxy-ip-10-0-134-244.ec2.internal\" (UID: \"009f099669c612b1a9a7e8809b1d3526\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.885244 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.885216 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.885359 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.885264 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.885359 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.885285 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1c6e032a9cd3412b502d428b8f5c545c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal\" (UID: \"1c6e032a9cd3412b502d428b8f5c545c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.898209 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.898186 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:29.985591 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.985537 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.990393 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:29.990369 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:29.999281 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:29.999259 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:30.099758 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:30.099726 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:30.200264 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:30.200234 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:30.300773 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:30.300744 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:30.303915 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.303896 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:43:30.304036 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.304019 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:43:30.383757 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.383675 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:43:30.395132 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.395103 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:43:30.399490 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.399456 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:38:29 +0000 UTC" deadline="2028-02-02 02:58:40.79787548 +0000 UTC"
Apr 22 18:43:30.399490 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.399489 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15608h15m10.398390002s"
Apr 22 18:43:30.400896 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:30.400879 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:30.437396 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.437365 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-p6p5b"
Apr 22 18:43:30.446916 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.446895 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-p6p5b"
Apr 22 18:43:30.500730 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.500693 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:43:30.501675 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:30.501655 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:30.570587 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:30.570550 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c6e032a9cd3412b502d428b8f5c545c.slice/crio-161faef8475daffcdfd3776a1dbe07d4fefa76defc4576b317a27e8ffb524ce2 WatchSource:0}: Error finding container 161faef8475daffcdfd3776a1dbe07d4fefa76defc4576b317a27e8ffb524ce2: Status 404 returned error can't find the container with id 161faef8475daffcdfd3776a1dbe07d4fefa76defc4576b317a27e8ffb524ce2
Apr 22 18:43:30.570953 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:30.570915 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod009f099669c612b1a9a7e8809b1d3526.slice/crio-a88ea51acfe4458b52b0069e6a13142eedc605d767d3fe63e6a7e0976af1c082 WatchSource:0}: Error finding container a88ea51acfe4458b52b0069e6a13142eedc605d767d3fe63e6a7e0976af1c082: Status 404 returned error can't find the container with id a88ea51acfe4458b52b0069e6a13142eedc605d767d3fe63e6a7e0976af1c082
Apr 22 18:43:30.575486 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.575471 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 18:43:30.579635 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.579599 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:43:30.602289 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:30.602253 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:30.702867 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:30.702794 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-244.ec2.internal\" not found"
Apr 22 18:43:30.777445 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.777416 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:43:30.783629 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.783600 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:30.796809 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.796784 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:43:30.797635 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.797606 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal"
Apr 22 18:43:30.807823 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:30.807799 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 18:43:31.363859 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.363830 2570 apiserver.go:52] "Watching apiserver"
Apr 22 18:43:31.374200 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.374167 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 18:43:31.375282 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.375247 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal","openshift-multus/multus-additional-cni-plugins-ftgq4","openshift-multus/multus-rhtwv","openshift-multus/network-metrics-daemon-w8q5c","openshift-network-operator/iptables-alerter-szgjh","openshift-ovn-kubernetes/ovnkube-node-rpznc","kube-system/konnectivity-agent-scf4m","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk","openshift-cluster-node-tuning-operator/tuned-x7jz9","openshift-dns/node-resolver-2z78z","openshift-image-registry/node-ca-gmml2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal","openshift-network-diagnostics/network-check-target-shrxv"]
Apr 22 18:43:31.378254 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.378230 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-scf4m"
Apr 22 18:43:31.380950 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.380922 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.382400 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.382209 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jqjm5\""
Apr 22 18:43:31.382400 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.382270 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 18:43:31.382997 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.382977 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.383391 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.383371 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 18:43:31.383849 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.383829 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 18:43:31.384168 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.384064 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:43:31.384168 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.384104 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-t2cwl\""
Apr 22 18:43:31.384168 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.384066 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:43:31.384461 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.384436 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 18:43:31.384543 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.384528 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 18:43:31.385221 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.385063 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 18:43:31.385404 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.385384 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-82r49\""
Apr 22 18:43:31.387564 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.387545 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:31.387659 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:31.387638 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:31.389812 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.389794 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-szgjh"
Apr 22 18:43:31.392341 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.392325 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.392429 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.392398 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 18:43:31.392489 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.392437 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qkfkn\""
Apr 22 18:43:31.392602 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.392587 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 18:43:31.392699 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.392685 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:43:31.392823 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.392807 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.393455 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393433 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/10fccf91-d12b-4767-94a3-6a751cf19eb8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.393554 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393470 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-socket-dir-parent\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.393634 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393549 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10fccf91-d12b-4767-94a3-6a751cf19eb8-cni-binary-copy\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.393634 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393593 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-cnibin\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.393743 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393655 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-run-netns\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.393743 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393683 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-hostroot\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.393743 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393708 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-conf-dir\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.393864 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393746 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-etc-kubernetes\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.393864 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393773 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-cnibin\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.393864 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393788 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-run-k8s-cni-cncf-io\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.393864 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393803 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.393864 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393837 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10fccf91-d12b-4767-94a3-6a751cf19eb8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.394027 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393873 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-cni-dir\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394027 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393904 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-var-lib-cni-bin\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394027 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393939 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-var-lib-cni-multus\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394027 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393964 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-system-cni-dir\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394027 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.393986 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-cni-binary-copy\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394027 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394026 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-var-lib-kubelet\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394041 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vdhc\" (UniqueName: \"kubernetes.io/projected/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-kube-api-access-4vdhc\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:31.394237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394064 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/68ae446a-922e-4629-b3f8-e81fa3ec7eec-agent-certs\") pod \"konnectivity-agent-scf4m\" (UID: \"68ae446a-922e-4629-b3f8-e81fa3ec7eec\") " pod="kube-system/konnectivity-agent-scf4m"
Apr 22 18:43:31.394237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-os-release\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.394237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394109 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-os-release\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394125 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-run-multus-certs\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394139 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxbq9\" (UniqueName: \"kubernetes.io/projected/10fccf91-d12b-4767-94a3-6a751cf19eb8-kube-api-access-zxbq9\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.394237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394180 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-daemon-config\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394228 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s254\" (UniqueName: \"kubernetes.io/projected/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-kube-api-access-2s254\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.394548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394256 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:31.394548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394277 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/68ae446a-922e-4629-b3f8-e81fa3ec7eec-konnectivity-ca\") pod \"konnectivity-agent-scf4m\" (UID: \"68ae446a-922e-4629-b3f8-e81fa3ec7eec\") " pod="kube-system/konnectivity-agent-scf4m"
Apr 22 18:43:31.394548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394329 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-system-cni-dir\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.394548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394516 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.394921 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.394903 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 18:43:31.395376 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.395357 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 18:43:31.395376 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.395367 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 18:43:31.395605 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.395589 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 18:43:31.395715 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.395699 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 18:43:31.395809 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.395795 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 18:43:31.395942 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.395926 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 18:43:31.396020 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.395979 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 18:43:31.396240 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.396203 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-7mrr7\""
Apr 22 18:43:31.396338 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.396320 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-sdbzg\""
Apr 22 18:43:31.396409 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.396380 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 18:43:31.396748 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.396726 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 18:43:31.397050 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.397030 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:43:31.397155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.397133 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wl2jm\""
Apr 22 18:43:31.397733 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.397693 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2z78z"
Apr 22 18:43:31.399946 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.399927 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 18:43:31.400040 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.399955 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8bdh6\""
Apr 22 18:43:31.400040 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.399982 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 18:43:31.401038 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.401006 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gmml2"
Apr 22 18:43:31.403502 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.403485 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:43:31.403959 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.403935 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4rbfm\""
Apr 22 18:43:31.404305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.404287 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:43:31.404526 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.404509 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:43:31.405976 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.405956 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:31.406070 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:31.406050 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:43:31.447637 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.447590 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:38:30 +0000 UTC" deadline="2027-12-26 13:57:54.704685924 +0000 UTC"
Apr 22 18:43:31.447753 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.447643 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14707h14m23.257046594s"
Apr 22 18:43:31.484890 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.484863 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 18:43:31.494819 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.494787 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:31.494965 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.494836 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/862d4b9d-0093-4dad-8175-851155e4b065-ovnkube-script-lib\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.494965 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.494866 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-registration-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.494965 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.494891 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-kubernetes\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.494965 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.494937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/68ae446a-922e-4629-b3f8-e81fa3ec7eec-konnectivity-ca\") pod \"konnectivity-agent-scf4m\" (UID: \"68ae446a-922e-4629-b3f8-e81fa3ec7eec\") " pod="kube-system/konnectivity-agent-scf4m"
Apr 22 18:43:31.495155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.494966 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/10fccf91-d12b-4767-94a3-6a751cf19eb8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.495155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.494995 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c19ea2f1-d4de-4038-9535-1cf172dcae5a-tmp\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.495155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495021 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-run-netns\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.495155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-etc-kubernetes\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.495155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495070 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d9fe3aa-63a5-4203-989e-96acc10d9ca1-host-slash\") pod \"iptables-alerter-szgjh\" (UID: \"3d9fe3aa-63a5-4203-989e-96acc10d9ca1\") " pod="openshift-network-operator/iptables-alerter-szgjh"
Apr 22 18:43:31.495155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495096 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-log-socket\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.495155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495125 2570
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.495155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495149 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/862d4b9d-0093-4dad-8175-851155e4b065-ovn-node-metrics-cert\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495175 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-sysctl-conf\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495201 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-systemd\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495227 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtj8t\" (UniqueName: \"kubernetes.io/projected/862d4b9d-0093-4dad-8175-851155e4b065-kube-api-access-wtj8t\") pod \"ovnkube-node-rpznc\" 
(UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495254 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10fccf91-d12b-4767-94a3-6a751cf19eb8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-var-lib-cni-bin\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495305 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-var-lib-cni-multus\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495330 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-run-openvswitch\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495374 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-etc-selinux\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495399 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-sysctl-d\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-system-cni-dir\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.495482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495451 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-etc-openvswitch\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495502 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-sysconfig\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495530 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-lib-modules\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/68ae446a-922e-4629-b3f8-e81fa3ec7eec-agent-certs\") pod \"konnectivity-agent-scf4m\" (UID: \"68ae446a-922e-4629-b3f8-e81fa3ec7eec\") " pod="kube-system/konnectivity-agent-scf4m" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495575 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/68ae446a-922e-4629-b3f8-e81fa3ec7eec-konnectivity-ca\") pod \"konnectivity-agent-scf4m\" (UID: \"68ae446a-922e-4629-b3f8-e81fa3ec7eec\") " pod="kube-system/konnectivity-agent-scf4m" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495602 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/10fccf91-d12b-4767-94a3-6a751cf19eb8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:31.494996 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495583 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-os-release\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495674 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-os-release\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:31.495724 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs podName:317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:31.995700082 +0000 UTC m=+2.979166001 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs") pod "network-metrics-daemon-w8q5c" (UID: "317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-run-multus-certs\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495922 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-run-netns\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " 
pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.495983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.495968 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-var-lib-cni-bin\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-etc-kubernetes\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496041 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-var-lib-cni-multus\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496093 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-system-cni-dir\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496260 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10fccf91-d12b-4767-94a3-6a751cf19eb8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4" 
Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496310 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-kubelet\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496339 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-cni-netd\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496376 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-run\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496393 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496404 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4b955592-582e-4878-a4b9-99767a2aaefb-hosts-file\") pod \"node-resolver-2z78z\" (UID: \"4b955592-582e-4878-a4b9-99767a2aaefb\") " pod="openshift-dns/node-resolver-2z78z" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496428 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdk9z\" (UniqueName: \"kubernetes.io/projected/4b955592-582e-4878-a4b9-99767a2aaefb-kube-api-access-qdk9z\") pod \"node-resolver-2z78z\" (UID: \"4b955592-582e-4878-a4b9-99767a2aaefb\") " pod="openshift-dns/node-resolver-2z78z" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-node-log\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.496511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496505 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-cni-bin\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496530 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-device-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496554 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-modprobe-d\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2s254\" (UniqueName: \"kubernetes.io/projected/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-kube-api-access-2s254\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496637 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-var-lib-kubelet\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496659 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-run-multus-certs\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496667 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4flk\" (UniqueName: \"kubernetes.io/projected/d150b818-e3a3-47e2-835c-16ae11dff162-kube-api-access-f4flk\") pod \"node-ca-gmml2\" (UID: \"d150b818-e3a3-47e2-835c-16ae11dff162\") " pod="openshift-image-registry/node-ca-gmml2" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496695 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-system-cni-dir\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496757 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-socket-dir-parent\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-system-cni-dir\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4" Apr 22 18:43:31.496945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496790 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-run-ovn\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.496945 
ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496915 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/862d4b9d-0093-4dad-8175-851155e4b065-ovnkube-config\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.496968 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxxrp\" (UniqueName: \"kubernetes.io/projected/55130c2b-d690-4eb6-a00d-df5385f42586-kube-api-access-fxxrp\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497001 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10fccf91-d12b-4767-94a3-6a751cf19eb8-cni-binary-copy\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497037 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-socket-dir-parent\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497074 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-cnibin\") pod \"multus-rhtwv\" (UID: 
\"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497100 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-hostroot\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497131 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-conf-dir\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497163 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3d9fe3aa-63a5-4203-989e-96acc10d9ca1-iptables-alerter-script\") pod \"iptables-alerter-szgjh\" (UID: \"3d9fe3aa-63a5-4203-989e-96acc10d9ca1\") " pod="openshift-network-operator/iptables-alerter-szgjh" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497233 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-host\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497267 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d150b818-e3a3-47e2-835c-16ae11dff162-serviceca\") pod \"node-ca-gmml2\" 
(UID: \"d150b818-e3a3-47e2-835c-16ae11dff162\") " pod="openshift-image-registry/node-ca-gmml2" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497283 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.497302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497301 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-cnibin\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4" Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497324 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-run-k8s-cni-cncf-io\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497353 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-slash\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497381 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-run-netns\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-cnibin\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497451 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-hostroot\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497472 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-conf-dir\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497488 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10fccf91-d12b-4767-94a3-6a751cf19eb8-cni-binary-copy\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497582 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-run-k8s-cni-cncf-io\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497613 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497655 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-cni-dir\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497686 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d150b818-e3a3-47e2-835c-16ae11dff162-host\") pod \"node-ca-gmml2\" (UID: \"d150b818-e3a3-47e2-835c-16ae11dff162\") " pod="openshift-image-registry/node-ca-gmml2"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497706 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-cni-binary-copy\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-var-lib-kubelet\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vdhc\" (UniqueName: \"kubernetes.io/projected/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-kube-api-access-4vdhc\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497821 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-649mt\" (UniqueName: \"kubernetes.io/projected/3d9fe3aa-63a5-4203-989e-96acc10d9ca1-kube-api-access-649mt\") pod \"iptables-alerter-szgjh\" (UID: \"3d9fe3aa-63a5-4203-989e-96acc10d9ca1\") " pod="openshift-network-operator/iptables-alerter-szgjh"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-run-systemd\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497860 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-var-lib-openvswitch\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497880 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-run-ovn-kubernetes\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497926 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/c19ea2f1-d4de-4038-9535-1cf172dcae5a-kube-api-access-cnzpb\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497955 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-os-release\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.497978 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-systemd-units\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.498061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498009 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/862d4b9d-0093-4dad-8175-851155e4b065-env-overrides\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.498416 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498079 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-socket-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.498416 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498107 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:31.498416 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498137 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-tuned\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.498416 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498158 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4b955592-582e-4878-a4b9-99767a2aaefb-tmp-dir\") pod \"node-resolver-2z78z\" (UID: \"4b955592-582e-4878-a4b9-99767a2aaefb\") " pod="openshift-dns/node-resolver-2z78z"
Apr 22 18:43:31.498416 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498176 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxbq9\" (UniqueName: \"kubernetes.io/projected/10fccf91-d12b-4767-94a3-6a751cf19eb8-kube-api-access-zxbq9\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.498416 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498195 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-sys-fs\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.498416 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498214 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-sys\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.498416 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498232 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-daemon-config\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.498797 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498780 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-daemon-config\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.498855 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498843 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-os-release\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.498898 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498853 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-multus-cni-dir\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.498934 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.498898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-cnibin\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.499041 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.499019 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10fccf91-d12b-4767-94a3-6a751cf19eb8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.499041 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.499040 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-host-var-lib-kubelet\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.499489 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.499468 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-cni-binary-copy\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.501120 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.501102 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/68ae446a-922e-4629-b3f8-e81fa3ec7eec-agent-certs\") pod \"konnectivity-agent-scf4m\" (UID: \"68ae446a-922e-4629-b3f8-e81fa3ec7eec\") " pod="kube-system/konnectivity-agent-scf4m"
Apr 22 18:43:31.507327 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.507303 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s254\" (UniqueName: \"kubernetes.io/projected/d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8-kube-api-access-2s254\") pod \"multus-rhtwv\" (UID: \"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8\") " pod="openshift-multus/multus-rhtwv"
Apr 22 18:43:31.508827 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.508803 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vdhc\" (UniqueName: \"kubernetes.io/projected/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-kube-api-access-4vdhc\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:31.508965 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.508943 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxbq9\" (UniqueName: \"kubernetes.io/projected/10fccf91-d12b-4767-94a3-6a751cf19eb8-kube-api-access-zxbq9\") pod \"multus-additional-cni-plugins-ftgq4\" (UID: \"10fccf91-d12b-4767-94a3-6a751cf19eb8\") " pod="openshift-multus/multus-additional-cni-plugins-ftgq4"
Apr 22 18:43:31.549066 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.549012 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" event={"ID":"1c6e032a9cd3412b502d428b8f5c545c","Type":"ContainerStarted","Data":"161faef8475daffcdfd3776a1dbe07d4fefa76defc4576b317a27e8ffb524ce2"}
Apr 22 18:43:31.550139 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.550110 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" event={"ID":"009f099669c612b1a9a7e8809b1d3526","Type":"ContainerStarted","Data":"a88ea51acfe4458b52b0069e6a13142eedc605d767d3fe63e6a7e0976af1c082"}
Apr 22 18:43:31.599006 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.598965 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-lib-modules\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599019 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-kubelet\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-cni-netd\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599070 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-run\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4b955592-582e-4878-a4b9-99767a2aaefb-hosts-file\") pod \"node-resolver-2z78z\" (UID: \"4b955592-582e-4878-a4b9-99767a2aaefb\") " pod="openshift-dns/node-resolver-2z78z"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdk9z\" (UniqueName: \"kubernetes.io/projected/4b955592-582e-4878-a4b9-99767a2aaefb-kube-api-access-qdk9z\") pod \"node-resolver-2z78z\" (UID: \"4b955592-582e-4878-a4b9-99767a2aaefb\") " pod="openshift-dns/node-resolver-2z78z"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599122 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-cni-netd\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-node-log\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599122 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-kubelet\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599160 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-cni-bin\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599179 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4b955592-582e-4878-a4b9-99767a2aaefb-hosts-file\") pod \"node-resolver-2z78z\" (UID: \"4b955592-582e-4878-a4b9-99767a2aaefb\") " pod="openshift-dns/node-resolver-2z78z"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599186 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-device-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.599191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599183 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-lib-modules\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599207 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-run\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599234 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-cni-bin\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599236 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-node-log\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-modprobe-d\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599284 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-var-lib-kubelet\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4flk\" (UniqueName: \"kubernetes.io/projected/d150b818-e3a3-47e2-835c-16ae11dff162-kube-api-access-f4flk\") pod \"node-ca-gmml2\" (UID: \"d150b818-e3a3-47e2-835c-16ae11dff162\") " pod="openshift-image-registry/node-ca-gmml2"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599237 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-device-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599331 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-run-ovn\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599355 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/862d4b9d-0093-4dad-8175-851155e4b065-ovnkube-config\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599375 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-var-lib-kubelet\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599374 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-modprobe-d\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599382 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxxrp\" (UniqueName: \"kubernetes.io/projected/55130c2b-d690-4eb6-a00d-df5385f42586-kube-api-access-fxxrp\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599415 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-run-ovn\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599428 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3d9fe3aa-63a5-4203-989e-96acc10d9ca1-iptables-alerter-script\") pod \"iptables-alerter-szgjh\" (UID: \"3d9fe3aa-63a5-4203-989e-96acc10d9ca1\") " pod="openshift-network-operator/iptables-alerter-szgjh"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599458 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-host\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599482 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d150b818-e3a3-47e2-835c-16ae11dff162-serviceca\") pod \"node-ca-gmml2\" (UID: \"d150b818-e3a3-47e2-835c-16ae11dff162\") " pod="openshift-image-registry/node-ca-gmml2"
Apr 22 18:43:31.599783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599510 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599537 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-slash\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599548 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-host\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599561 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-run-netns\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d150b818-e3a3-47e2-835c-16ae11dff162-host\") pod \"node-ca-gmml2\" (UID: \"d150b818-e3a3-47e2-835c-16ae11dff162\") " pod="openshift-image-registry/node-ca-gmml2"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599635 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-649mt\" (UniqueName: \"kubernetes.io/projected/3d9fe3aa-63a5-4203-989e-96acc10d9ca1-kube-api-access-649mt\") pod \"iptables-alerter-szgjh\" (UID: \"3d9fe3aa-63a5-4203-989e-96acc10d9ca1\") " pod="openshift-network-operator/iptables-alerter-szgjh"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599661 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-run-systemd\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-slash\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599687 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-var-lib-openvswitch\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-run-ovn-kubernetes\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599735 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/c19ea2f1-d4de-4038-9535-1cf172dcae5a-kube-api-access-cnzpb\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599760 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-systemd-units\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599784 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/862d4b9d-0093-4dad-8175-851155e4b065-env-overrides\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599815 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-socket-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599844 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d150b818-e3a3-47e2-835c-16ae11dff162-host\") pod \"node-ca-gmml2\" (UID: \"d150b818-e3a3-47e2-835c-16ae11dff162\") " pod="openshift-image-registry/node-ca-gmml2"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599590 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.600540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599892 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-tuned\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599915 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4b955592-582e-4878-a4b9-99767a2aaefb-tmp-dir\") pod \"node-resolver-2z78z\" (UID: \"4b955592-582e-4878-a4b9-99767a2aaefb\") " pod="openshift-dns/node-resolver-2z78z"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-sys-fs\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599950 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-sys\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/862d4b9d-0093-4dad-8175-851155e4b065-ovnkube-script-lib\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-registration-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600021 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-kubernetes\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600037 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c19ea2f1-d4de-4038-9535-1cf172dcae5a-tmp\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d9fe3aa-63a5-4203-989e-96acc10d9ca1-host-slash\") pod \"iptables-alerter-szgjh\" (UID: \"3d9fe3aa-63a5-4203-989e-96acc10d9ca1\") " pod="openshift-network-operator/iptables-alerter-szgjh"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600054 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-sys-fs\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600071 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-log-socket\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600072 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d150b818-e3a3-47e2-835c-16ae11dff162-serviceca\") pod \"node-ca-gmml2\" (UID: \"d150b818-e3a3-47e2-835c-16ae11dff162\") " pod="openshift-image-registry/node-ca-gmml2"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.599638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-run-netns\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600106 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/862d4b9d-0093-4dad-8175-851155e4b065-ovnkube-config\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600149 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/862d4b9d-0093-4dad-8175-851155e4b065-ovn-node-metrics-cert\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600163 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-kubernetes\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9"
Apr 22
18:43:31.601199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600165 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-sysctl-conf\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600192 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-systemd-units\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-systemd\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600209 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-var-lib-openvswitch\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtj8t\" (UniqueName: \"kubernetes.io/projected/862d4b9d-0093-4dad-8175-851155e4b065-kube-api-access-wtj8t\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600239 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-registration-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600271 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-run-openvswitch\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600300 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-etc-selinux\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600324 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-sysctl-d\") pod 
\"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-etc-openvswitch\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600376 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-sysconfig\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600397 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4b955592-582e-4878-a4b9-99767a2aaefb-tmp-dir\") pod \"node-resolver-2z78z\" (UID: \"4b955592-582e-4878-a4b9-99767a2aaefb\") " pod="openshift-dns/node-resolver-2z78z" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600106 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3d9fe3aa-63a5-4203-989e-96acc10d9ca1-iptables-alerter-script\") pod \"iptables-alerter-szgjh\" (UID: \"3d9fe3aa-63a5-4203-989e-96acc10d9ca1\") " pod="openshift-network-operator/iptables-alerter-szgjh" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600438 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d9fe3aa-63a5-4203-989e-96acc10d9ca1-host-slash\") pod 
\"iptables-alerter-szgjh\" (UID: \"3d9fe3aa-63a5-4203-989e-96acc10d9ca1\") " pod="openshift-network-operator/iptables-alerter-szgjh" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600454 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-sysconfig\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-host-run-ovn-kubernetes\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.601765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600495 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-log-socket\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600514 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/862d4b9d-0093-4dad-8175-851155e4b065-ovnkube-script-lib\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600534 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-etc-openvswitch\") 
pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600545 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-etc-selinux\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600152 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-run-systemd\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600580 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/862d4b9d-0093-4dad-8175-851155e4b065-env-overrides\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/862d4b9d-0093-4dad-8175-851155e4b065-run-openvswitch\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600582 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-sys\") pod \"tuned-x7jz9\" (UID: 
\"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600650 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-systemd\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600685 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-sysctl-conf\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-sysctl-d\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.602431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.600787 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/55130c2b-d690-4eb6-a00d-df5385f42586-socket-dir\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.603462 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.603439 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c19ea2f1-d4de-4038-9535-1cf172dcae5a-tmp\") pod \"tuned-x7jz9\" (UID: 
\"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.603560 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.603468 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/862d4b9d-0093-4dad-8175-851155e4b065-ovn-node-metrics-cert\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.603603 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.603583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c19ea2f1-d4de-4038-9535-1cf172dcae5a-etc-tuned\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.607207 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:31.607152 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:31.607207 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:31.607177 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:31.607207 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:31.607191 2570 projected.go:194] Error preparing data for projected volume kube-api-access-5vqz5 for pod openshift-network-diagnostics/network-check-target-shrxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:31.607364 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:31.607256 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5 podName:2897aec5-9829-4f9d-a583-c9d3db52b220 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:32.107237648 +0000 UTC m=+3.090703580 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5vqz5" (UniqueName: "kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5") pod "network-check-target-shrxv" (UID: "2897aec5-9829-4f9d-a583-c9d3db52b220") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:31.609090 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.609064 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-649mt\" (UniqueName: \"kubernetes.io/projected/3d9fe3aa-63a5-4203-989e-96acc10d9ca1-kube-api-access-649mt\") pod \"iptables-alerter-szgjh\" (UID: \"3d9fe3aa-63a5-4203-989e-96acc10d9ca1\") " pod="openshift-network-operator/iptables-alerter-szgjh" Apr 22 18:43:31.609188 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.609148 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxxrp\" (UniqueName: \"kubernetes.io/projected/55130c2b-d690-4eb6-a00d-df5385f42586-kube-api-access-fxxrp\") pod \"aws-ebs-csi-driver-node-kx8hk\" (UID: \"55130c2b-d690-4eb6-a00d-df5385f42586\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.609485 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.609463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtj8t\" (UniqueName: \"kubernetes.io/projected/862d4b9d-0093-4dad-8175-851155e4b065-kube-api-access-wtj8t\") pod \"ovnkube-node-rpznc\" (UID: \"862d4b9d-0093-4dad-8175-851155e4b065\") " pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.609762 ip-10-0-134-244 kubenswrapper[2570]: 
I0422 18:43:31.609736 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnzpb\" (UniqueName: \"kubernetes.io/projected/c19ea2f1-d4de-4038-9535-1cf172dcae5a-kube-api-access-cnzpb\") pod \"tuned-x7jz9\" (UID: \"c19ea2f1-d4de-4038-9535-1cf172dcae5a\") " pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.610177 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.610147 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdk9z\" (UniqueName: \"kubernetes.io/projected/4b955592-582e-4878-a4b9-99767a2aaefb-kube-api-access-qdk9z\") pod \"node-resolver-2z78z\" (UID: \"4b955592-582e-4878-a4b9-99767a2aaefb\") " pod="openshift-dns/node-resolver-2z78z" Apr 22 18:43:31.610924 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.610907 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4flk\" (UniqueName: \"kubernetes.io/projected/d150b818-e3a3-47e2-835c-16ae11dff162-kube-api-access-f4flk\") pod \"node-ca-gmml2\" (UID: \"d150b818-e3a3-47e2-835c-16ae11dff162\") " pod="openshift-image-registry/node-ca-gmml2" Apr 22 18:43:31.689551 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.689473 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-scf4m" Apr 22 18:43:31.697295 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.697264 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" Apr 22 18:43:31.708173 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.708149 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rhtwv" Apr 22 18:43:31.717831 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.717804 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-szgjh" Apr 22 18:43:31.724260 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.724237 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:43:31.727713 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.727693 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:43:31.731671 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.731651 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" Apr 22 18:43:31.739318 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.739300 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" Apr 22 18:43:31.746876 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.746851 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2z78z" Apr 22 18:43:31.752504 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:31.752484 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gmml2" Apr 22 18:43:32.002910 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.002820 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:43:32.003077 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:32.002992 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:32.003077 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:32.003075 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs podName:317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:33.003053751 +0000 UTC m=+3.986519689 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs") pod "network-metrics-daemon-w8q5c" (UID: "317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:32.204265 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.204228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:43:32.204465 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:32.204415 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:32.204465 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:32.204441 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:32.204465 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:32.204454 2570 projected.go:194] Error preparing data for projected volume kube-api-access-5vqz5 for pod openshift-network-diagnostics/network-check-target-shrxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:32.204692 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:32.204528 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5 podName:2897aec5-9829-4f9d-a583-c9d3db52b220 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:43:33.204507017 +0000 UTC m=+4.187972936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vqz5" (UniqueName: "kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5") pod "network-check-target-shrxv" (UID: "2897aec5-9829-4f9d-a583-c9d3db52b220") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:32.239393 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:32.239365 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19ea2f1_d4de_4038_9535_1cf172dcae5a.slice/crio-250d471d3135508f0e08f06711b63708876addf20e554289c9a9e0b01d62e592 WatchSource:0}: Error finding container 250d471d3135508f0e08f06711b63708876addf20e554289c9a9e0b01d62e592: Status 404 returned error can't find the container with id 250d471d3135508f0e08f06711b63708876addf20e554289c9a9e0b01d62e592 Apr 22 18:43:32.241416 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:32.241390 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd150b818_e3a3_47e2_835c_16ae11dff162.slice/crio-927d3369756818948f98738f21ec828918a8881c3f71150111c36bf3a310e060 WatchSource:0}: Error finding container 927d3369756818948f98738f21ec828918a8881c3f71150111c36bf3a310e060: Status 404 returned error can't find the container with id 927d3369756818948f98738f21ec828918a8881c3f71150111c36bf3a310e060 Apr 22 18:43:32.244460 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:32.244436 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55130c2b_d690_4eb6_a00d_df5385f42586.slice/crio-a178a275cff527d4feeaede74c31f39245414e45e31f41975d5600bbc32d6645 WatchSource:0}: Error finding container 
a178a275cff527d4feeaede74c31f39245414e45e31f41975d5600bbc32d6645: Status 404 returned error can't find the container with id a178a275cff527d4feeaede74c31f39245414e45e31f41975d5600bbc32d6645 Apr 22 18:43:32.245681 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:32.245651 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b955592_582e_4878_a4b9_99767a2aaefb.slice/crio-21bf52e7d850f6e33c59fd3827315f9c00e01d11ebd2f090434d18ae28d87f76 WatchSource:0}: Error finding container 21bf52e7d850f6e33c59fd3827315f9c00e01d11ebd2f090434d18ae28d87f76: Status 404 returned error can't find the container with id 21bf52e7d850f6e33c59fd3827315f9c00e01d11ebd2f090434d18ae28d87f76 Apr 22 18:43:32.249055 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:32.246719 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10fccf91_d12b_4767_94a3_6a751cf19eb8.slice/crio-48c1394f1a88d8774e316051b826965f1c710a37e2921f2f2e27398060e4961f WatchSource:0}: Error finding container 48c1394f1a88d8774e316051b826965f1c710a37e2921f2f2e27398060e4961f: Status 404 returned error can't find the container with id 48c1394f1a88d8774e316051b826965f1c710a37e2921f2f2e27398060e4961f Apr 22 18:43:32.250893 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:32.250564 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68ae446a_922e_4629_b3f8_e81fa3ec7eec.slice/crio-8c33b38717efe32ef8e047ccd1ff1980d9a58f888e180477da1be47fcba1078a WatchSource:0}: Error finding container 8c33b38717efe32ef8e047ccd1ff1980d9a58f888e180477da1be47fcba1078a: Status 404 returned error can't find the container with id 8c33b38717efe32ef8e047ccd1ff1980d9a58f888e180477da1be47fcba1078a Apr 22 18:43:32.251410 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:32.251391 2570 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod862d4b9d_0093_4dad_8175_851155e4b065.slice/crio-e7e8e091a97acccdce71e36bbea67a870383bc32d831d09597089f35041478a0 WatchSource:0}: Error finding container e7e8e091a97acccdce71e36bbea67a870383bc32d831d09597089f35041478a0: Status 404 returned error can't find the container with id e7e8e091a97acccdce71e36bbea67a870383bc32d831d09597089f35041478a0 Apr 22 18:43:32.253507 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:43:32.253376 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd24c0ebd_b6b6_4d29_bb7b_8abf194a33f8.slice/crio-c26be8bd7d0570a9550f213757130cb32bfec09457737915759cc710164b4914 WatchSource:0}: Error finding container c26be8bd7d0570a9550f213757130cb32bfec09457737915759cc710164b4914: Status 404 returned error can't find the container with id c26be8bd7d0570a9550f213757130cb32bfec09457737915759cc710164b4914 Apr 22 18:43:32.448561 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.448358 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:38:30 +0000 UTC" deadline="2028-01-28 01:44:23.946549792 +0000 UTC" Apr 22 18:43:32.448561 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.448555 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15487h0m51.497997425s" Apr 22 18:43:32.546096 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.546001 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:43:32.546217 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:32.546126 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282" Apr 22 18:43:32.557054 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.557015 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2z78z" event={"ID":"4b955592-582e-4878-a4b9-99767a2aaefb","Type":"ContainerStarted","Data":"21bf52e7d850f6e33c59fd3827315f9c00e01d11ebd2f090434d18ae28d87f76"} Apr 22 18:43:32.559495 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.559463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-szgjh" event={"ID":"3d9fe3aa-63a5-4203-989e-96acc10d9ca1","Type":"ContainerStarted","Data":"0cab0eb95010c3e85b5d757956e47dbf53886bd830d729cb2b7169dc65e164a6"} Apr 22 18:43:32.560549 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.560520 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gmml2" event={"ID":"d150b818-e3a3-47e2-835c-16ae11dff162","Type":"ContainerStarted","Data":"927d3369756818948f98738f21ec828918a8881c3f71150111c36bf3a310e060"} Apr 22 18:43:32.561559 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.561511 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" event={"ID":"c19ea2f1-d4de-4038-9535-1cf172dcae5a","Type":"ContainerStarted","Data":"250d471d3135508f0e08f06711b63708876addf20e554289c9a9e0b01d62e592"} Apr 22 18:43:32.564112 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.564072 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" event={"ID":"009f099669c612b1a9a7e8809b1d3526","Type":"ContainerStarted","Data":"e6efb8aea6fc092bfeba107c2d6260532aa03a823fb45fbbe2f67742445fce5f"}
Apr 22 18:43:32.565635 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.565590 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rhtwv" event={"ID":"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8","Type":"ContainerStarted","Data":"c26be8bd7d0570a9550f213757130cb32bfec09457737915759cc710164b4914"}
Apr 22 18:43:32.566669 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.566646 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerStarted","Data":"e7e8e091a97acccdce71e36bbea67a870383bc32d831d09597089f35041478a0"}
Apr 22 18:43:32.568202 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.568171 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-scf4m" event={"ID":"68ae446a-922e-4629-b3f8-e81fa3ec7eec","Type":"ContainerStarted","Data":"8c33b38717efe32ef8e047ccd1ff1980d9a58f888e180477da1be47fcba1078a"}
Apr 22 18:43:32.570256 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.570232 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" event={"ID":"10fccf91-d12b-4767-94a3-6a751cf19eb8","Type":"ContainerStarted","Data":"48c1394f1a88d8774e316051b826965f1c710a37e2921f2f2e27398060e4961f"}
Apr 22 18:43:32.573225 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:32.573198 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" event={"ID":"55130c2b-d690-4eb6-a00d-df5385f42586","Type":"ContainerStarted","Data":"a178a275cff527d4feeaede74c31f39245414e45e31f41975d5600bbc32d6645"}
Apr 22 18:43:33.011549
ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:33.010899 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:33.011549 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:33.011060 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:33.011549 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:33.011124 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs podName:317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:35.011104759 +0000 UTC m=+5.994570682 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs") pod "network-metrics-daemon-w8q5c" (UID: "317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:33.212879 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:33.212838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:33.213075 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:33.213054 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:43:33.213137 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:33.213077 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:43:33.213137 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:33.213091 2570 projected.go:194] Error preparing data for projected volume kube-api-access-5vqz5 for pod openshift-network-diagnostics/network-check-target-shrxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:33.213232 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:33.213150 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5 podName:2897aec5-9829-4f9d-a583-c9d3db52b220 nodeName:}" failed.
No retries permitted until 2026-04-22 18:43:35.21313062 +0000 UTC m=+6.196596553 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vqz5" (UniqueName: "kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5") pod "network-check-target-shrxv" (UID: "2897aec5-9829-4f9d-a583-c9d3db52b220") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:33.546741 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:33.545507 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:33.547194 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:33.546813 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:43:33.594690 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:33.594585 2570 generic.go:358] "Generic (PLEG): container finished" podID="1c6e032a9cd3412b502d428b8f5c545c" containerID="147a1a58ad65738406e6df58a0ecf4648083421cdad3992ce21c60fd3a97432c" exitCode=0
Apr 22 18:43:33.595965 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:33.595930 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" event={"ID":"1c6e032a9cd3412b502d428b8f5c545c","Type":"ContainerDied","Data":"147a1a58ad65738406e6df58a0ecf4648083421cdad3992ce21c60fd3a97432c"}
Apr 22 18:43:33.613697 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:33.613636 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-244.ec2.internal" podStartSLOduration=3.613596567 podStartE2EDuration="3.613596567s" podCreationTimestamp="2026-04-22 18:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:43:32.590609645 +0000 UTC m=+3.574075585" watchObservedRunningTime="2026-04-22 18:43:33.613596567 +0000 UTC m=+4.597062509"
Apr 22 18:43:34.546098 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:34.545534 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:34.546098 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:34.545707 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:34.606651 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:34.606080 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" event={"ID":"1c6e032a9cd3412b502d428b8f5c545c","Type":"ContainerStarted","Data":"853439132f631a5c53f7ed11db6d52ff5bed9039b2d2b35b60efe717956ca3ee"}
Apr 22 18:43:35.029280 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:35.029236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:35.029480 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:35.029376 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:35.029480 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:35.029440 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs podName:317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:39.02942202 +0000 UTC m=+10.012887946 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs") pod "network-metrics-daemon-w8q5c" (UID: "317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:39.271881 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:35.230699 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:35.230919 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:35.230885 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:43:35.230919 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:35.230905 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:43:35.230919 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:35.230917 2570 projected.go:194] Error preparing data for projected volume kube-api-access-5vqz5 for pod openshift-network-diagnostics/network-check-target-shrxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:35.231082 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:35.230980 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5 podName:2897aec5-9829-4f9d-a583-c9d3db52b220 nodeName:}" failed.
No retries permitted until 2026-04-22 18:43:39.230961557 +0000 UTC m=+10.214427481 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vqz5" (UniqueName: "kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5") pod "network-check-target-shrxv" (UID: "2897aec5-9829-4f9d-a583-c9d3db52b220") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:35.548403 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:35.548370 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:35.548603 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:35.548542 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:43:36.546177 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.546132 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:36.546688 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:36.546322 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:36.737111 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.737047 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-244.ec2.internal" podStartSLOduration=6.737024143 podStartE2EDuration="6.737024143s" podCreationTimestamp="2026-04-22 18:43:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:43:34.63291362 +0000 UTC m=+5.616379562" watchObservedRunningTime="2026-04-22 18:43:36.737024143 +0000 UTC m=+7.720490085"
Apr 22 18:43:36.737418 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.737387 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9v7dk"]
Apr 22 18:43:36.740679 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.740650 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:36.740808 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:36.740729 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:43:36.849471 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.849435 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-kubelet-config\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:36.849663 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.849503 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-dbus\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:36.849663 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.849583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:36.950641 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.950292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:36.950641 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.950360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\"
(UniqueName: \"kubernetes.io/host-path/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-kubelet-config\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:36.950641 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.950411 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-dbus\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:36.950641 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:36.950434 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:43:36.950641 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:36.950506 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret podName:a88529a8-1055-4ebf-bd16-aa151ce8e4cb nodeName:}" failed. No retries permitted until 2026-04-22 18:43:37.450489703 +0000 UTC m=+8.433955621 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret") pod "global-pull-secret-syncer-9v7dk" (UID: "a88529a8-1055-4ebf-bd16-aa151ce8e4cb") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:43:36.950641 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.950441 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-kubelet-config\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:36.950641 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:36.950589 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-dbus\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:37.454502 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:37.454460 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:37.454711 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:37.454642 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:43:37.454770 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:37.454729 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret
podName:a88529a8-1055-4ebf-bd16-aa151ce8e4cb nodeName:}" failed. No retries permitted until 2026-04-22 18:43:38.45470824 +0000 UTC m=+9.438174161 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret") pod "global-pull-secret-syncer-9v7dk" (UID: "a88529a8-1055-4ebf-bd16-aa151ce8e4cb") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:43:37.545883 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:37.545847 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:37.546059 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:37.545967 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:43:38.463318 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:38.463268 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:38.463779 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:38.463431 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:43:38.463779 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:38.463490 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret podName:a88529a8-1055-4ebf-bd16-aa151ce8e4cb nodeName:}" failed. No retries permitted until 2026-04-22 18:43:40.463474584 +0000 UTC m=+11.446940502 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret") pod "global-pull-secret-syncer-9v7dk" (UID: "a88529a8-1055-4ebf-bd16-aa151ce8e4cb") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:43:38.545641 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:38.545596 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:38.545805 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:38.545744 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:43:38.546128 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:38.546097 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:38.546240 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:38.546207 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:39.069775 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:39.069717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:39.069958 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:39.069880 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:39.070018 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:39.069957 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs podName:317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:47.069933542 +0000 UTC m=+18.053399478 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs") pod "network-metrics-daemon-w8q5c" (UID: "317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:43:39.271881 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:39.271841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:39.272069 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:39.272013 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:43:39.272069 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:39.272034 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:43:39.272069 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:39.272048 2570 projected.go:194] Error preparing data for projected volume kube-api-access-5vqz5 for pod openshift-network-diagnostics/network-check-target-shrxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:39.272228 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:39.272109 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5 podName:2897aec5-9829-4f9d-a583-c9d3db52b220 nodeName:}" failed.
No retries permitted until 2026-04-22 18:43:47.272090524 +0000 UTC m=+18.255556450 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vqz5" (UniqueName: "kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5") pod "network-check-target-shrxv" (UID: "2897aec5-9829-4f9d-a583-c9d3db52b220") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:43:39.546660 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:39.546608 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:39.547031 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:39.546748 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:43:40.481916 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:40.481864 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:40.482113 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:40.482085 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:43:40.482184 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:40.482142 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret podName:a88529a8-1055-4ebf-bd16-aa151ce8e4cb nodeName:}" failed. No retries permitted until 2026-04-22 18:43:44.482128724 +0000 UTC m=+15.465594643 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret") pod "global-pull-secret-syncer-9v7dk" (UID: "a88529a8-1055-4ebf-bd16-aa151ce8e4cb") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:43:40.546637 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:40.546134 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:40.546637 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:40.546258 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:43:40.546637 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:40.546318 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:40.546637 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:40.546452 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:41.545967 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:41.545930 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:41.546128 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:41.546066 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:43:42.545332 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:42.545297 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:42.545783 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:42.545413 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:43:42.545783 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:42.545480 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:42.545783 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:42.545608 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:43.545697 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:43.545654 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:43:43.546086 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:43.545782 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220" Apr 22 18:43:44.514804 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:44.514758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:43:44.514967 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:44.514892 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:44.514967 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:44.514949 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret podName:a88529a8-1055-4ebf-bd16-aa151ce8e4cb nodeName:}" failed. No retries permitted until 2026-04-22 18:43:52.514935895 +0000 UTC m=+23.498401813 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret") pod "global-pull-secret-syncer-9v7dk" (UID: "a88529a8-1055-4ebf-bd16-aa151ce8e4cb") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:44.545448 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:44.545419 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:43:44.545448 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:44.545441 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:43:44.545697 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:44.545546 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb" Apr 22 18:43:44.545754 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:44.545710 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282" Apr 22 18:43:45.546078 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:45.546040 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:43:45.546526 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:45.546175 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220" Apr 22 18:43:46.545613 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:46.545574 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:43:46.545851 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:46.545578 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:43:46.545851 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:46.545735 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb" Apr 22 18:43:46.545851 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:46.545805 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282" Apr 22 18:43:47.131609 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:47.131579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:43:47.132071 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:47.131734 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:47.132071 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:47.131816 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs podName:317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:03.13179464 +0000 UTC m=+34.115260578 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs") pod "network-metrics-daemon-w8q5c" (UID: "317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:43:47.333667 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:47.333614 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:43:47.333901 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:47.333749 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:43:47.333901 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:47.333771 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:43:47.333901 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:47.333782 2570 projected.go:194] Error preparing data for projected volume kube-api-access-5vqz5 for pod openshift-network-diagnostics/network-check-target-shrxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:47.333901 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:47.333840 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5 podName:2897aec5-9829-4f9d-a583-c9d3db52b220 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:44:03.333819998 +0000 UTC m=+34.317285920 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vqz5" (UniqueName: "kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5") pod "network-check-target-shrxv" (UID: "2897aec5-9829-4f9d-a583-c9d3db52b220") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:43:47.547719 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:47.545442 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:43:47.547966 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:47.547728 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220" Apr 22 18:43:48.545214 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:48.545181 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:43:48.545744 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:48.545182 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:43:48.545744 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:48.545326 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282" Apr 22 18:43:48.545744 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:48.545392 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb" Apr 22 18:43:49.547058 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:49.547026 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:43:49.547442 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:49.547140 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220" Apr 22 18:43:49.632713 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:49.632669 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-scf4m" event={"ID":"68ae446a-922e-4629-b3f8-e81fa3ec7eec","Type":"ContainerStarted","Data":"9fca7e0e62e73d6da96f57d9fa806391c2d20409f677fa5ace026b29628cbe0c"} Apr 22 18:43:49.633975 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:49.633956 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" event={"ID":"55130c2b-d690-4eb6-a00d-df5385f42586","Type":"ContainerStarted","Data":"0870784389f8568a2a2d4e4599165ec046499980d86efcfa1111637199474c2b"} Apr 22 18:43:50.546189 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.545958 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:43:50.546348 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.545968 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:43:50.546348 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:50.546299 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb" Apr 22 18:43:50.546469 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:50.546385 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282" Apr 22 18:43:50.636802 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.636778 2570 generic.go:358] "Generic (PLEG): container finished" podID="10fccf91-d12b-4767-94a3-6a751cf19eb8" containerID="c945474a46f74ba784879b022636d3f474f73d67fd39af55ad50325391e6d399" exitCode=0 Apr 22 18:43:50.637489 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.636847 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" event={"ID":"10fccf91-d12b-4767-94a3-6a751cf19eb8","Type":"ContainerDied","Data":"c945474a46f74ba784879b022636d3f474f73d67fd39af55ad50325391e6d399"} Apr 22 18:43:50.638054 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.638034 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2z78z" event={"ID":"4b955592-582e-4878-a4b9-99767a2aaefb","Type":"ContainerStarted","Data":"c0aebe382f4e5f48d1848f2237efd058a162c8ded3bb688958cd3ee281e64324"} Apr 22 18:43:50.639208 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.639181 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gmml2" event={"ID":"d150b818-e3a3-47e2-835c-16ae11dff162","Type":"ContainerStarted","Data":"e40484401ec83667cae03e128af9ee68233c229cacaec1093c590a50212148b0"} Apr 22 18:43:50.640349 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.640327 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" event={"ID":"c19ea2f1-d4de-4038-9535-1cf172dcae5a","Type":"ContainerStarted","Data":"9e026b691e5e4145c9bf3e3e1dde277d16cfdad01a8aaaf13eefd498dd0bf530"} Apr 22 18:43:50.641584 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.641558 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rhtwv" event={"ID":"d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8","Type":"ContainerStarted","Data":"c7d6db474f4564edd59523a7afeca32d888ed98665622a99419a458934361a5f"} Apr 22 18:43:50.644728 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.644711 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/0.log" Apr 22 18:43:50.645049 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.645032 2570 generic.go:358] "Generic (PLEG): container finished" podID="862d4b9d-0093-4dad-8175-851155e4b065" containerID="6b9c445e3066f28446690a5198c1d0a1cd91976c71d356ee32b410837a101431" exitCode=1 Apr 22 18:43:50.645118 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.645093 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerStarted","Data":"2031d3c4ffee74b303bae683e937efa84858fe84354b474a1b1be744b8f31ed1"} Apr 22 18:43:50.645166 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.645125 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerStarted","Data":"f0ad9b3d56a02399ea56e959e2766a13797455687b7211cdb1efd43c8f2a43fb"} Apr 22 18:43:50.645166 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.645139 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" 
event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerStarted","Data":"c0ab90ee7df5f4a99118482fefd8021b6109d71925bb862508558dc2739b37e9"} Apr 22 18:43:50.645166 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.645154 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerStarted","Data":"bcc9a60670d389ec37e8b7f51d65349abcb9f71ee028f491f54b2f5e82760ca1"} Apr 22 18:43:50.645288 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.645168 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerDied","Data":"6b9c445e3066f28446690a5198c1d0a1cd91976c71d356ee32b410837a101431"} Apr 22 18:43:50.645288 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.645183 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerStarted","Data":"aa925ebf9ee35ba0fa7552678efc0163786f4b2e71f5f665273c9cd1104726e4"} Apr 22 18:43:50.676641 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.676589 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-scf4m" podStartSLOduration=4.566705819 podStartE2EDuration="21.676579373s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:32.252794307 +0000 UTC m=+3.236260240" lastFinishedPulling="2026-04-22 18:43:49.362667874 +0000 UTC m=+20.346133794" observedRunningTime="2026-04-22 18:43:50.676272531 +0000 UTC m=+21.659738469" watchObservedRunningTime="2026-04-22 18:43:50.676579373 +0000 UTC m=+21.660045314" Apr 22 18:43:50.694932 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.694900 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-x7jz9" podStartSLOduration=4.491756969 podStartE2EDuration="21.694889259s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:32.241526958 +0000 UTC m=+3.224992891" lastFinishedPulling="2026-04-22 18:43:49.444659257 +0000 UTC m=+20.428125181" observedRunningTime="2026-04-22 18:43:50.694561326 +0000 UTC m=+21.678027267" watchObservedRunningTime="2026-04-22 18:43:50.694889259 +0000 UTC m=+21.678355200" Apr 22 18:43:50.718457 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.718419 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gmml2" podStartSLOduration=4.519612167 podStartE2EDuration="21.718409065s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:32.243586545 +0000 UTC m=+3.227052465" lastFinishedPulling="2026-04-22 18:43:49.44238343 +0000 UTC m=+20.425849363" observedRunningTime="2026-04-22 18:43:50.715207502 +0000 UTC m=+21.698673443" watchObservedRunningTime="2026-04-22 18:43:50.718409065 +0000 UTC m=+21.701875005" Apr 22 18:43:50.740021 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.739984 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rhtwv" podStartSLOduration=4.5405910689999995 podStartE2EDuration="21.739975372s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:32.255270913 +0000 UTC m=+3.238736835" lastFinishedPulling="2026-04-22 18:43:49.454655216 +0000 UTC m=+20.438121138" observedRunningTime="2026-04-22 18:43:50.739692609 +0000 UTC m=+21.723158556" watchObservedRunningTime="2026-04-22 18:43:50.739975372 +0000 UTC m=+21.723441313" Apr 22 18:43:50.757947 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.757900 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2z78z" podStartSLOduration=4.55699881 
podStartE2EDuration="21.757888581s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:32.249432764 +0000 UTC m=+3.232898685" lastFinishedPulling="2026-04-22 18:43:49.450322335 +0000 UTC m=+20.433788456" observedRunningTime="2026-04-22 18:43:50.757685114 +0000 UTC m=+21.741151047" watchObservedRunningTime="2026-04-22 18:43:50.757888581 +0000 UTC m=+21.741354522" Apr 22 18:43:50.912222 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:50.912201 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:43:51.471225 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:51.471113 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:43:50.912217591Z","UUID":"837c32df-a9f2-4e33-86c1-647c40ac469b","Handler":null,"Name":"","Endpoint":""} Apr 22 18:43:51.474531 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:51.474436 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:43:51.474531 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:51.474530 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:43:51.545791 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:51.545755 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:43:51.545949 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:51.545885 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220" Apr 22 18:43:51.650126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:51.650087 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" event={"ID":"55130c2b-d690-4eb6-a00d-df5385f42586","Type":"ContainerStarted","Data":"852dda466e04a8cd832dbfcff7726ef8fe04d3698b51493c0b5c089c9bfe3947"} Apr 22 18:43:51.651834 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:51.651806 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-szgjh" event={"ID":"3d9fe3aa-63a5-4203-989e-96acc10d9ca1","Type":"ContainerStarted","Data":"ad1200a7b069db98a7b2f0fc5b192c843c89b9f5e40a5f8c8cbbdeb3fef4ddd6"} Apr 22 18:43:51.670231 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:51.670173 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-szgjh" podStartSLOduration=5.558446901 podStartE2EDuration="22.670158348s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:32.250960387 +0000 UTC m=+3.234426317" lastFinishedPulling="2026-04-22 18:43:49.362671844 +0000 UTC m=+20.346137764" observedRunningTime="2026-04-22 18:43:51.670012719 +0000 UTC m=+22.653478676" watchObservedRunningTime="2026-04-22 18:43:51.670158348 +0000 UTC m=+22.653624288" Apr 22 18:43:52.545692 ip-10-0-134-244 kubenswrapper[2570]: I0422 
18:43:52.545458 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:43:52.545909 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:52.545458 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:43:52.545909 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:52.545784 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282" Apr 22 18:43:52.545909 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:52.545876 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb" Apr 22 18:43:52.569695 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:52.569652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:43:52.569899 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:52.569737 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:52.569899 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:52.569802 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret podName:a88529a8-1055-4ebf-bd16-aa151ce8e4cb nodeName:}" failed. No retries permitted until 2026-04-22 18:44:08.569786742 +0000 UTC m=+39.553252661 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret") pod "global-pull-secret-syncer-9v7dk" (UID: "a88529a8-1055-4ebf-bd16-aa151ce8e4cb") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:43:52.575691 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:52.575662 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-scf4m" Apr 22 18:43:52.576330 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:52.576310 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-scf4m" Apr 22 18:43:52.655777 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:52.655736 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" event={"ID":"55130c2b-d690-4eb6-a00d-df5385f42586","Type":"ContainerStarted","Data":"a145f55dbbf04b539da6a2b051089567e9e545f1f4d1759d99f416e901da73fe"} Apr 22 18:43:53.545985 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:53.545952 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:43:53.546193 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:53.546070 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:43:53.660805 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:53.660775    2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/0.log"
Apr 22 18:43:53.661433 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:53.661196    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerStarted","Data":"353a485404169530182d4c85cb51684cee4118254c395cab3cd9188127f88083"}
Apr 22 18:43:54.545555 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:54.545515    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:54.545775 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:54.545514    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:54.545775 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:54.545684    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:54.545775 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:54.545718    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:43:55.545549 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.545356    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:55.546313 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:55.545657    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:43:55.668927 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.668895    2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/0.log"
Apr 22 18:43:55.669247 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.669224    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerStarted","Data":"8789c3f9d582ac0f15ef9cab8dc45dc2db15db4a10437ecb89d82c79c7f0dbd3"}
Apr 22 18:43:55.669540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.669512    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:55.669655 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.669549    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:55.669774 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.669755    2570 scope.go:117] "RemoveContainer" containerID="6b9c445e3066f28446690a5198c1d0a1cd91976c71d356ee32b410837a101431"
Apr 22 18:43:55.671047 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.670990    2570 generic.go:358] "Generic (PLEG): container finished" podID="10fccf91-d12b-4767-94a3-6a751cf19eb8" containerID="0aecd5e82d258b025440a83adb1bf98297b595224d02239d919b1ee1e2f226b5" exitCode=0
Apr 22 18:43:55.671047 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.671036    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" event={"ID":"10fccf91-d12b-4767-94a3-6a751cf19eb8","Type":"ContainerDied","Data":"0aecd5e82d258b025440a83adb1bf98297b595224d02239d919b1ee1e2f226b5"}
Apr 22 18:43:55.686252 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.686227    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:55.700757 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:55.699174    2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kx8hk" podStartSLOduration=6.987096225 podStartE2EDuration="26.699157691s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:32.246181781 +0000 UTC m=+3.229647715" lastFinishedPulling="2026-04-22 18:43:51.958243252 +0000 UTC m=+22.941709181" observedRunningTime="2026-04-22 18:43:52.675278933 +0000 UTC m=+23.658744875" watchObservedRunningTime="2026-04-22 18:43:55.699157691 +0000 UTC m=+26.682623633"
Apr 22 18:43:56.545300 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.545266    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:56.545461 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.545266    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:56.545461 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:56.545391    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:43:56.545577 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:56.545476    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:56.677467 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.677437    2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/0.log"
Apr 22 18:43:56.677926 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.677893    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" event={"ID":"862d4b9d-0093-4dad-8175-851155e4b065","Type":"ContainerStarted","Data":"faef7428a119bc84faaa1c095bb6a94dc96b18e13380de7e7728d0f35bbd708b"}
Apr 22 18:43:56.678281 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.678246    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:56.680674 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.680644    2570 generic.go:358] "Generic (PLEG): container finished" podID="10fccf91-d12b-4767-94a3-6a751cf19eb8" containerID="4279cc8959edd39cdd426fceceb6504ef0bc26925bb4a422852583ca39207def" exitCode=0
Apr 22 18:43:56.680795 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.680707    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" event={"ID":"10fccf91-d12b-4767-94a3-6a751cf19eb8","Type":"ContainerDied","Data":"4279cc8959edd39cdd426fceceb6504ef0bc26925bb4a422852583ca39207def"}
Apr 22 18:43:56.695945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.695918    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc"
Apr 22 18:43:56.711001 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.710953    2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" podStartSLOduration=10.193616801 podStartE2EDuration="27.710936408s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:32.253732511 +0000 UTC m=+3.237198441" lastFinishedPulling="2026-04-22 18:43:49.771051924 +0000 UTC m=+20.754518048" observedRunningTime="2026-04-22 18:43:56.709540592 +0000 UTC m=+27.693006514" watchObservedRunningTime="2026-04-22 18:43:56.710936408 +0000 UTC m=+27.694402350"
Apr 22 18:43:56.934891 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.934640    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9v7dk"]
Apr 22 18:43:56.935026 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.934930    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:56.935069 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:56.935028    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:43:56.938136 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.938111    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-w8q5c"]
Apr 22 18:43:56.938274 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.938224    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:56.938332 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:56.938312    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:56.939037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.939017    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-shrxv"]
Apr 22 18:43:56.939142 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:56.939105    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:56.939198 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:56.939181    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:43:57.684857 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:57.684821    2570 generic.go:358] "Generic (PLEG): container finished" podID="10fccf91-d12b-4767-94a3-6a751cf19eb8" containerID="8aa87350cc627fc21cd7c62df915aa660f53c884a5533dd2d52ff72ce951f62f" exitCode=0
Apr 22 18:43:57.685260 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:57.684907    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" event={"ID":"10fccf91-d12b-4767-94a3-6a751cf19eb8","Type":"ContainerDied","Data":"8aa87350cc627fc21cd7c62df915aa660f53c884a5533dd2d52ff72ce951f62f"}
Apr 22 18:43:58.545208 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:58.545169    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:43:58.545208 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:58.545196    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:43:58.545437 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:43:58.545281    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:43:58.545437 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:58.545293    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:43:58.545437 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:58.545399    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:43:58.545607 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:43:58.545516    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:44:00.546402 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:00.545950    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:44:00.546402 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:00.545949    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:44:00.546402 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:00.546070    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:44:00.546402 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:00.545950    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:44:00.546402 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:00.546231    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:44:00.546402 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:00.546130    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:44:01.449316 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:01.449259    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-scf4m"
Apr 22 18:44:01.449484 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:01.449445    2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 18:44:01.450293 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:01.450269    2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-scf4m"
Apr 22 18:44:02.545127 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.545091    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:44:02.545636 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.545139    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:44:02.545636 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.545095    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:44:02.545636 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:02.545218    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-shrxv" podUID="2897aec5-9829-4f9d-a583-c9d3db52b220"
Apr 22 18:44:02.545636 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:02.545287    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:44:02.545636 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:02.545376    2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v7dk" podUID="a88529a8-1055-4ebf-bd16-aa151ce8e4cb"
Apr 22 18:44:02.844123 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.844038    2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-244.ec2.internal" event="NodeReady"
Apr 22 18:44:02.844356 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.844216    2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 18:44:02.891436 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.891402    2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6zh6w"]
Apr 22 18:44:02.895945 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.895919    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:02.896673 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.896648    2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ltknd"]
Apr 22 18:44:02.898459 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.898437    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 18:44:02.898563 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.898455    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xcpp5\""
Apr 22 18:44:02.898563 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.898463    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 18:44:02.899416 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.899398    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ltknd"
Apr 22 18:44:02.902046 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.902028    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 18:44:02.902164 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.902147    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xmncv\""
Apr 22 18:44:02.902246 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.902161    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 18:44:02.902309 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.902253    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 18:44:02.907020 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.906999    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6zh6w"]
Apr 22 18:44:02.910002 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:02.909965    2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ltknd"]
Apr 22 18:44:03.048235 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.048196    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj54t\" (UniqueName: \"kubernetes.io/projected/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-kube-api-access-dj54t\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.048235 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.048228    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd"
Apr 22 18:44:03.048505 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.048265    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mcnv\" (UniqueName: \"kubernetes.io/projected/57a5c4fb-aa29-4923-b848-a57df5a62462-kube-api-access-8mcnv\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd"
Apr 22 18:44:03.048505 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.048386    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-config-volume\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.048505 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.048421    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.048505 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.048442    2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-tmp-dir\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.149593 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.149555    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj54t\" (UniqueName: \"kubernetes.io/projected/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-kube-api-access-dj54t\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.149593 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.149596    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd"
Apr 22 18:44:03.149826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.149661    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mcnv\" (UniqueName: \"kubernetes.io/projected/57a5c4fb-aa29-4923-b848-a57df5a62462-kube-api-access-8mcnv\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd"
Apr 22 18:44:03.149826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.149682    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:44:03.149826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.149732    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-config-volume\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.149826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.149767    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.149826 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.149791    2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:03.150029 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.149841    2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:44:03.150029 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.149851    2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:03.150029 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.149791    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-tmp-dir\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.150029 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.149857    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert podName:57a5c4fb-aa29-4923-b848-a57df5a62462 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:03.649835165 +0000 UTC m=+34.633301097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert") pod "ingress-canary-ltknd" (UID: "57a5c4fb-aa29-4923-b848-a57df5a62462") : secret "canary-serving-cert" not found
Apr 22 18:44:03.150029 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.149920    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls podName:ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:03.64990336 +0000 UTC m=+34.633369279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls") pod "dns-default-6zh6w" (UID: "ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:03.150029 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.149956    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs podName:317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:35.149930192 +0000 UTC m=+66.133396112 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs") pod "network-metrics-daemon-w8q5c" (UID: "317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:44:03.150218 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.150077    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-tmp-dir\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.150328 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.150310    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-config-volume\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.161724 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.161572    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj54t\" (UniqueName: \"kubernetes.io/projected/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-kube-api-access-dj54t\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.161724 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.161715    2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mcnv\" (UniqueName: \"kubernetes.io/projected/57a5c4fb-aa29-4923-b848-a57df5a62462-kube-api-access-8mcnv\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd"
Apr 22 18:44:03.350733 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.350690    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:44:03.350903 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.350849    2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:44:03.350903 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.350872    2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:44:03.350903 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.350883    2570 projected.go:194] Error preparing data for projected volume kube-api-access-5vqz5 for pod openshift-network-diagnostics/network-check-target-shrxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:44:03.351027 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.350943    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5 podName:2897aec5-9829-4f9d-a583-c9d3db52b220 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:35.350928753 +0000 UTC m=+66.334394672 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5vqz5" (UniqueName: "kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5") pod "network-check-target-shrxv" (UID: "2897aec5-9829-4f9d-a583-c9d3db52b220") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:44:03.653551 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.653505    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:03.653964 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.653586    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd"
Apr 22 18:44:03.653964 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.653679    2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:03.653964 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.653755    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls podName:ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:04.653735361 +0000 UTC m=+35.637201282 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls") pod "dns-default-6zh6w" (UID: "ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d") : secret "dns-default-metrics-tls" not found
Apr 22 18:44:03.653964 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.653758    2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:03.653964 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:03.653812    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert podName:57a5c4fb-aa29-4923-b848-a57df5a62462 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:04.653796388 +0000 UTC m=+35.637262311 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert") pod "ingress-canary-ltknd" (UID: "57a5c4fb-aa29-4923-b848-a57df5a62462") : secret "canary-serving-cert" not found
Apr 22 18:44:03.699503 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.699428    2570 generic.go:358] "Generic (PLEG): container finished" podID="10fccf91-d12b-4767-94a3-6a751cf19eb8" containerID="d9802a2f1903849b49709fcaf66f4fb4cfb6dbcee69c7dedf78c65c14ade8316" exitCode=0
Apr 22 18:44:03.699503 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:03.699490    2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" event={"ID":"10fccf91-d12b-4767-94a3-6a751cf19eb8","Type":"ContainerDied","Data":"d9802a2f1903849b49709fcaf66f4fb4cfb6dbcee69c7dedf78c65c14ade8316"}
Apr 22 18:44:04.545743 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.545705    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv"
Apr 22 18:44:04.545920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.545706    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c"
Apr 22 18:44:04.546040 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.545706    2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk"
Apr 22 18:44:04.549385 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.549360    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:44:04.549385 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.549370    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgrtw\""
Apr 22 18:44:04.549573 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.549365    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 18:44:04.549573 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.549363    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c6clv\""
Apr 22 18:44:04.549573 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.549371    2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:44:04.549573 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.549362    2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:44:04.661837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.661798    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd"
Apr 22 18:44:04.662273 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:04.661964    2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:44:04.662273 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.662022    2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:44:04.662273 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:04.662042    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert podName:57a5c4fb-aa29-4923-b848-a57df5a62462 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:06.662022382 +0000 UTC m=+37.645488307 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert") pod "ingress-canary-ltknd" (UID: "57a5c4fb-aa29-4923-b848-a57df5a62462") : secret "canary-serving-cert" not found
Apr 22 18:44:04.662273 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:04.662094    2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:44:04.662273 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:04.662156    2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls podName:ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d nodeName:}" failed.
No retries permitted until 2026-04-22 18:44:06.662143243 +0000 UTC m=+37.645609162 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls") pod "dns-default-6zh6w" (UID: "ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d") : secret "dns-default-metrics-tls" not found Apr 22 18:44:04.705268 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.705223 2570 generic.go:358] "Generic (PLEG): container finished" podID="10fccf91-d12b-4767-94a3-6a751cf19eb8" containerID="3ca7afd92cae79c5637fe1f94fe9f816c88eb902eb7c72e14a1233411cffea78" exitCode=0 Apr 22 18:44:04.705511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:04.705290 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" event={"ID":"10fccf91-d12b-4767-94a3-6a751cf19eb8","Type":"ContainerDied","Data":"3ca7afd92cae79c5637fe1f94fe9f816c88eb902eb7c72e14a1233411cffea78"} Apr 22 18:44:05.710315 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:05.710286 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" event={"ID":"10fccf91-d12b-4767-94a3-6a751cf19eb8","Type":"ContainerStarted","Data":"b8dd2f2145c8c5ee9beb2e07ee882e90e376e7eb910334695ceee7693e838776"} Apr 22 18:44:05.740433 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:05.740364 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ftgq4" podStartSLOduration=5.635424552 podStartE2EDuration="36.740348735s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:43:32.249125666 +0000 UTC m=+3.232591584" lastFinishedPulling="2026-04-22 18:44:03.354049847 +0000 UTC m=+34.337515767" observedRunningTime="2026-04-22 18:44:05.739844143 +0000 UTC m=+36.723310084" watchObservedRunningTime="2026-04-22 18:44:05.740348735 +0000 UTC m=+36.723814676" Apr 22 
18:44:06.678460 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:06.678412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w" Apr 22 18:44:06.678659 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:06.678482 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd" Apr 22 18:44:06.678659 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:06.678582 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:44:06.678659 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:06.678588 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:44:06.678781 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:06.678663 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls podName:ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:10.678647807 +0000 UTC m=+41.662113727 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls") pod "dns-default-6zh6w" (UID: "ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d") : secret "dns-default-metrics-tls" not found Apr 22 18:44:06.678781 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:06.678677 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert podName:57a5c4fb-aa29-4923-b848-a57df5a62462 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:10.678671098 +0000 UTC m=+41.662137017 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert") pod "ingress-canary-ltknd" (UID: "57a5c4fb-aa29-4923-b848-a57df5a62462") : secret "canary-serving-cert" not found Apr 22 18:44:08.590793 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:08.590736 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:44:08.594421 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:08.594401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a88529a8-1055-4ebf-bd16-aa151ce8e4cb-original-pull-secret\") pod \"global-pull-secret-syncer-9v7dk\" (UID: \"a88529a8-1055-4ebf-bd16-aa151ce8e4cb\") " pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:44:08.784479 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:08.784439 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v7dk" Apr 22 18:44:08.910889 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:08.910853 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9v7dk"] Apr 22 18:44:08.914147 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:44:08.914121 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda88529a8_1055_4ebf_bd16_aa151ce8e4cb.slice/crio-acc5c3748f91b95f02600cf6b1719fd19e88b50b850d03a6ab9ecc6b66899bfe WatchSource:0}: Error finding container acc5c3748f91b95f02600cf6b1719fd19e88b50b850d03a6ab9ecc6b66899bfe: Status 404 returned error can't find the container with id acc5c3748f91b95f02600cf6b1719fd19e88b50b850d03a6ab9ecc6b66899bfe Apr 22 18:44:09.719308 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:09.719258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9v7dk" event={"ID":"a88529a8-1055-4ebf-bd16-aa151ce8e4cb","Type":"ContainerStarted","Data":"acc5c3748f91b95f02600cf6b1719fd19e88b50b850d03a6ab9ecc6b66899bfe"} Apr 22 18:44:10.706792 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:10.706750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd" Apr 22 18:44:10.706991 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:10.706877 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w" Apr 22 18:44:10.706991 ip-10-0-134-244 kubenswrapper[2570]: E0422 
18:44:10.706925 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:44:10.706991 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:10.706974 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:44:10.707158 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:10.707009 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert podName:57a5c4fb-aa29-4923-b848-a57df5a62462 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:18.706985896 +0000 UTC m=+49.690451835 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert") pod "ingress-canary-ltknd" (UID: "57a5c4fb-aa29-4923-b848-a57df5a62462") : secret "canary-serving-cert" not found Apr 22 18:44:10.707158 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:10.707031 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls podName:ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:18.707021138 +0000 UTC m=+49.690487062 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls") pod "dns-default-6zh6w" (UID: "ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d") : secret "dns-default-metrics-tls" not found Apr 22 18:44:13.728958 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:13.728914 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9v7dk" event={"ID":"a88529a8-1055-4ebf-bd16-aa151ce8e4cb","Type":"ContainerStarted","Data":"269d0068c64873cb7da7eb929ee0580aa4dfe73e0e0efc66a4d94ce9f310a504"} Apr 22 18:44:13.749923 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:13.749874 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9v7dk" podStartSLOduration=33.709128813 podStartE2EDuration="37.749859645s" podCreationTimestamp="2026-04-22 18:43:36 +0000 UTC" firstStartedPulling="2026-04-22 18:44:08.915924894 +0000 UTC m=+39.899390814" lastFinishedPulling="2026-04-22 18:44:12.956655726 +0000 UTC m=+43.940121646" observedRunningTime="2026-04-22 18:44:13.748800339 +0000 UTC m=+44.732266280" watchObservedRunningTime="2026-04-22 18:44:13.749859645 +0000 UTC m=+44.733325582" Apr 22 18:44:18.766148 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:18.766115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w" Apr 22 18:44:18.766591 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:18.766159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " 
pod="openshift-ingress-canary/ingress-canary-ltknd" Apr 22 18:44:18.766591 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:18.766314 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:44:18.766591 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:18.766377 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls podName:ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:34.766362084 +0000 UTC m=+65.749828004 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls") pod "dns-default-6zh6w" (UID: "ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d") : secret "dns-default-metrics-tls" not found Apr 22 18:44:18.766591 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:18.766320 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:44:18.766591 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:18.766462 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert podName:57a5c4fb-aa29-4923-b848-a57df5a62462 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:34.766448942 +0000 UTC m=+65.749914862 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert") pod "ingress-canary-ltknd" (UID: "57a5c4fb-aa29-4923-b848-a57df5a62462") : secret "canary-serving-cert" not found Apr 22 18:44:28.698450 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:28.698418 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rpznc" Apr 22 18:44:34.767573 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:34.767514 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd" Apr 22 18:44:34.768150 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:34.767672 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w" Apr 22 18:44:34.768150 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:34.767677 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:44:34.768150 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:34.767748 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert podName:57a5c4fb-aa29-4923-b848-a57df5a62462 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:06.767732748 +0000 UTC m=+97.751198667 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert") pod "ingress-canary-ltknd" (UID: "57a5c4fb-aa29-4923-b848-a57df5a62462") : secret "canary-serving-cert" not found Apr 22 18:44:34.768150 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:34.767796 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:44:34.768150 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:34.767869 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls podName:ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d nodeName:}" failed. No retries permitted until 2026-04-22 18:45:06.767850063 +0000 UTC m=+97.751315985 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls") pod "dns-default-6zh6w" (UID: "ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d") : secret "dns-default-metrics-tls" not found Apr 22 18:44:35.171780 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.171745 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:44:35.174159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.174139 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:44:35.182119 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:35.182086 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:44:35.182182 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:44:35.182171 
2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs podName:317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:39.182154554 +0000 UTC m=+130.165620491 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs") pod "network-metrics-daemon-w8q5c" (UID: "317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282") : secret "metrics-daemon-secret" not found Apr 22 18:44:35.373656 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.373589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:44:35.376401 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.376381 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:44:35.386752 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.386731 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:44:35.397143 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.397123 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vqz5\" (UniqueName: \"kubernetes.io/projected/2897aec5-9829-4f9d-a583-c9d3db52b220-kube-api-access-5vqz5\") pod \"network-check-target-shrxv\" (UID: \"2897aec5-9829-4f9d-a583-c9d3db52b220\") " pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:44:35.458333 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.458262 2570 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c6clv\"" Apr 22 18:44:35.466768 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.466741 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:44:35.583925 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.583892 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-shrxv"] Apr 22 18:44:35.588043 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:44:35.588001 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2897aec5_9829_4f9d_a583_c9d3db52b220.slice/crio-6a86f5d5a88a1eb5707c60c38758003b86376bf5f3440a918a3810d5cd34d7cf WatchSource:0}: Error finding container 6a86f5d5a88a1eb5707c60c38758003b86376bf5f3440a918a3810d5cd34d7cf: Status 404 returned error can't find the container with id 6a86f5d5a88a1eb5707c60c38758003b86376bf5f3440a918a3810d5cd34d7cf Apr 22 18:44:35.772228 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:35.772142 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-shrxv" event={"ID":"2897aec5-9829-4f9d-a583-c9d3db52b220","Type":"ContainerStarted","Data":"6a86f5d5a88a1eb5707c60c38758003b86376bf5f3440a918a3810d5cd34d7cf"} Apr 22 18:44:38.780190 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:38.780160 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-shrxv" event={"ID":"2897aec5-9829-4f9d-a583-c9d3db52b220","Type":"ContainerStarted","Data":"156d580a2b94c520587209ea00d62b14d0524a1fe97b30011ebab0f9d587ec6a"} Apr 22 18:44:38.780562 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:38.780281 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:44:38.798447 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:44:38.798398 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-shrxv" podStartSLOduration=67.249940702 podStartE2EDuration="1m9.798383638s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:44:35.590279931 +0000 UTC m=+66.573745850" lastFinishedPulling="2026-04-22 18:44:38.138722864 +0000 UTC m=+69.122188786" observedRunningTime="2026-04-22 18:44:38.797385789 +0000 UTC m=+69.780851730" watchObservedRunningTime="2026-04-22 18:44:38.798383638 +0000 UTC m=+69.781849579" Apr 22 18:45:06.791410 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:06.791307 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w" Apr 22 18:45:06.791410 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:06.791354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd" Apr 22 18:45:06.791897 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:06.791444 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:45:06.791897 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:06.791449 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:45:06.791897 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:06.791507 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert podName:57a5c4fb-aa29-4923-b848-a57df5a62462 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:10.791492469 +0000 UTC m=+161.774958388 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert") pod "ingress-canary-ltknd" (UID: "57a5c4fb-aa29-4923-b848-a57df5a62462") : secret "canary-serving-cert" not found Apr 22 18:45:06.791897 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:06.791520 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls podName:ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:10.791513437 +0000 UTC m=+161.774979355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls") pod "dns-default-6zh6w" (UID: "ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d") : secret "dns-default-metrics-tls" not found Apr 22 18:45:09.784844 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:09.784808 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-shrxv" Apr 22 18:45:29.774112 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.774077 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-thx79"] Apr 22 18:45:29.776806 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.776790 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-thx79" Apr 22 18:45:29.780569 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.780542 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 18:45:29.780569 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.780545 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:45:29.781245 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.781230 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:45:29.781741 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.781714 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-9zjlg\"" Apr 22 18:45:29.782033 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.782017 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 18:45:29.784943 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.784921 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-88bf8fcd4-w7x4n"] Apr 22 18:45:29.785851 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.785835 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 18:45:29.787601 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.787586 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.790160 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.790145 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 18:45:29.790346 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.790316 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 18:45:29.790440 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.790399 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 18:45:29.790501 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.790477 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 18:45:29.790501 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.790484 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 18:45:29.790611 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.790521 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 18:45:29.790786 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.790770 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-cx5zk\""
Apr 22 18:45:29.793954 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.793935 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-thx79"]
Apr 22 18:45:29.801188 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.801163 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-88bf8fcd4-w7x4n"]
Apr 22 18:45:29.846437 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846406 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd2nx\" (UniqueName: \"kubernetes.io/projected/78f709d1-ef5e-40ac-8845-3f0108fe6b96-kube-api-access-cd2nx\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.846437 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846438 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0c7ad148-5682-41c7-874a-a20686c43134-tmp\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.846657 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0c7ad148-5682-41c7-874a-a20686c43134-snapshots\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.846657 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846481 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-stats-auth\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.846657 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846495 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7ad148-5682-41c7-874a-a20686c43134-serving-cert\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.846657 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846573 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.846657 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846596 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6gks\" (UniqueName: \"kubernetes.io/projected/0c7ad148-5682-41c7-874a-a20686c43134-kube-api-access-w6gks\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.846657 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846650 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-default-certificate\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.846863 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846683 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c7ad148-5682-41c7-874a-a20686c43134-service-ca-bundle\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.846863 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846705 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.846863 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.846728 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c7ad148-5682-41c7-874a-a20686c43134-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.886063 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.886035 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"]
Apr 22 18:45:29.888800 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.888787 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:29.892684 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.892661 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:45:29.892684 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.892673 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-hl5rx\""
Apr 22 18:45:29.892847 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.892744 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 18:45:29.892847 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.892755 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 18:45:29.893159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.893146 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 18:45:29.915452 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.915425 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"]
Apr 22 18:45:29.918009 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.917988 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-62t72"]
Apr 22 18:45:29.920860 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.920843 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:45:29.929037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.929008 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 22 18:45:29.930494 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.930476 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rt4fv\""
Apr 22 18:45:29.935554 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.935536 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 22 18:45:29.947777 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.947754 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd2nx\" (UniqueName: \"kubernetes.io/projected/78f709d1-ef5e-40ac-8845-3f0108fe6b96-kube-api-access-cd2nx\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.947870 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.947784 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0c7ad148-5682-41c7-874a-a20686c43134-tmp\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.947870 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.947807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0c7ad148-5682-41c7-874a-a20686c43134-snapshots\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.947870 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.947844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142521b5-1f3f-465a-ba35-247424907e87-config\") pod \"service-ca-operator-d6fc45fc5-t8769\" (UID: \"142521b5-1f3f-465a-ba35-247424907e87\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:29.948015 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.947878 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-stats-auth\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.948015 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.947902 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7ad148-5682-41c7-874a-a20686c43134-serving-cert\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.948015 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.947984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.948015 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.948009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6gks\" (UniqueName: \"kubernetes.io/projected/0c7ad148-5682-41c7-874a-a20686c43134-kube-api-access-w6gks\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.948192 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.948031 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-default-certificate\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.948192 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.948065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/142521b5-1f3f-465a-ba35-247424907e87-serving-cert\") pod \"service-ca-operator-d6fc45fc5-t8769\" (UID: \"142521b5-1f3f-465a-ba35-247424907e87\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:29.948192 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:29.948073 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:45:29.948192 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.948092 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svhcl\" (UniqueName: \"kubernetes.io/projected/142521b5-1f3f-465a-ba35-247424907e87-kube-api-access-svhcl\") pod \"service-ca-operator-d6fc45fc5-t8769\" (UID: \"142521b5-1f3f-465a-ba35-247424907e87\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:29.948192 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:29.948132 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:30.44811259 +0000 UTC m=+121.431578509 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : secret "router-metrics-certs-default" not found
Apr 22 18:45:29.948192 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.948180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c7ad148-5682-41c7-874a-a20686c43134-service-ca-bundle\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.948460 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.948210 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.948460 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.948236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c7ad148-5682-41c7-874a-a20686c43134-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.948460 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:29.948350 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:30.448336826 +0000 UTC m=+121.431802785 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : configmap references non-existent config key: service-ca.crt
Apr 22 18:45:29.948779 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.948756 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0c7ad148-5682-41c7-874a-a20686c43134-tmp\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.949041 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.949018 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0c7ad148-5682-41c7-874a-a20686c43134-snapshots\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.949223 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.949206 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c7ad148-5682-41c7-874a-a20686c43134-service-ca-bundle\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.949511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.949488 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c7ad148-5682-41c7-874a-a20686c43134-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.950450 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.950423 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-stats-auth\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.950746 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.950729 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0c7ad148-5682-41c7-874a-a20686c43134-serving-cert\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.950814 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.950752 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-default-certificate\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:29.960722 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.960701 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-62t72"]
Apr 22 18:45:29.971379 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.971356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6gks\" (UniqueName: \"kubernetes.io/projected/0c7ad148-5682-41c7-874a-a20686c43134-kube-api-access-w6gks\") pod \"insights-operator-585dfdc468-thx79\" (UID: \"0c7ad148-5682-41c7-874a-a20686c43134\") " pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:29.981071 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:29.981050 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd2nx\" (UniqueName: \"kubernetes.io/projected/78f709d1-ef5e-40ac-8845-3f0108fe6b96-kube-api-access-cd2nx\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:30.048800 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.048740 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142521b5-1f3f-465a-ba35-247424907e87-config\") pod \"service-ca-operator-d6fc45fc5-t8769\" (UID: \"142521b5-1f3f-465a-ba35-247424907e87\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:30.048886 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.048801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/142521b5-1f3f-465a-ba35-247424907e87-serving-cert\") pod \"service-ca-operator-d6fc45fc5-t8769\" (UID: \"142521b5-1f3f-465a-ba35-247424907e87\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:30.048886 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.048829 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svhcl\" (UniqueName: \"kubernetes.io/projected/142521b5-1f3f-465a-ba35-247424907e87-kube-api-access-svhcl\") pod \"service-ca-operator-d6fc45fc5-t8769\" (UID: \"142521b5-1f3f-465a-ba35-247424907e87\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:30.048886 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.048861 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:45:30.049014 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.048920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/71f3cd65-16e4-4173-821a-48a924d80e7a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:45:30.049269 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.049251 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/142521b5-1f3f-465a-ba35-247424907e87-config\") pod \"service-ca-operator-d6fc45fc5-t8769\" (UID: \"142521b5-1f3f-465a-ba35-247424907e87\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:30.050983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.050957 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/142521b5-1f3f-465a-ba35-247424907e87-serving-cert\") pod \"service-ca-operator-d6fc45fc5-t8769\" (UID: \"142521b5-1f3f-465a-ba35-247424907e87\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:30.058009 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.057989 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svhcl\" (UniqueName: \"kubernetes.io/projected/142521b5-1f3f-465a-ba35-247424907e87-kube-api-access-svhcl\") pod \"service-ca-operator-d6fc45fc5-t8769\" (UID: \"142521b5-1f3f-465a-ba35-247424907e87\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:30.086057 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.086039 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-thx79"
Apr 22 18:45:30.150230 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.150195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:45:30.150362 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.150283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/71f3cd65-16e4-4173-821a-48a924d80e7a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:45:30.150647 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:30.150495 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:45:30.150647 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:30.150559 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert podName:71f3cd65-16e4-4173-821a-48a924d80e7a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:30.650543217 +0000 UTC m=+121.634009136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-62t72" (UID: "71f3cd65-16e4-4173-821a-48a924d80e7a") : secret "networking-console-plugin-cert" not found
Apr 22 18:45:30.151140 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.151118 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/71f3cd65-16e4-4173-821a-48a924d80e7a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:45:30.198123 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.198054 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"
Apr 22 18:45:30.200083 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.200054 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-thx79"]
Apr 22 18:45:30.204143 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:45:30.204118 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c7ad148_5682_41c7_874a_a20686c43134.slice/crio-12a8b1c1bc8465311e18f9da60d51614b246be8651716b8e0a42bcb16e246a4a WatchSource:0}: Error finding container 12a8b1c1bc8465311e18f9da60d51614b246be8651716b8e0a42bcb16e246a4a: Status 404 returned error can't find the container with id 12a8b1c1bc8465311e18f9da60d51614b246be8651716b8e0a42bcb16e246a4a
Apr 22 18:45:30.315149 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.315079 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769"]
Apr 22 18:45:30.319077 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:45:30.319039 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod142521b5_1f3f_465a_ba35_247424907e87.slice/crio-eb6b9e49d04676c683f58d142429b685c2182141c51442a1cb32dd65b9ceacaa WatchSource:0}: Error finding container eb6b9e49d04676c683f58d142429b685c2182141c51442a1cb32dd65b9ceacaa: Status 404 returned error can't find the container with id eb6b9e49d04676c683f58d142429b685c2182141c51442a1cb32dd65b9ceacaa
Apr 22 18:45:30.453085 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.453048 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:30.453254 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.453120 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:30.453254 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:30.453197 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:45:30.453254 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:30.453244 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:31.453225102 +0000 UTC m=+122.436691022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : configmap references non-existent config key: service-ca.crt
Apr 22 18:45:30.453363 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:30.453259 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:31.453252716 +0000 UTC m=+122.436718635 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : secret "router-metrics-certs-default" not found
Apr 22 18:45:30.654702 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.654666 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:45:30.654877 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:30.654819 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:45:30.654920 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:30.654892 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert podName:71f3cd65-16e4-4173-821a-48a924d80e7a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:31.654875185 +0000 UTC m=+122.638341104 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-62t72" (UID: "71f3cd65-16e4-4173-821a-48a924d80e7a") : secret "networking-console-plugin-cert" not found
Apr 22 18:45:30.884973 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.884933 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-thx79" event={"ID":"0c7ad148-5682-41c7-874a-a20686c43134","Type":"ContainerStarted","Data":"12a8b1c1bc8465311e18f9da60d51614b246be8651716b8e0a42bcb16e246a4a"}
Apr 22 18:45:30.886554 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:30.886525 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769" event={"ID":"142521b5-1f3f-465a-ba35-247424907e87","Type":"ContainerStarted","Data":"eb6b9e49d04676c683f58d142429b685c2182141c51442a1cb32dd65b9ceacaa"}
Apr 22 18:45:31.462886 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:31.462852 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:31.463093 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:31.462942 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:31.463093 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:31.463069 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:45:31.463213 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:31.463134 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:33.463117642 +0000 UTC m=+124.446583560 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : secret "router-metrics-certs-default" not found
Apr 22 18:45:31.463433 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:31.463404 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:33.463383325 +0000 UTC m=+124.446849264 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : configmap references non-existent config key: service-ca.crt
Apr 22 18:45:31.664872 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:31.664828 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:45:31.665035 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:31.664972 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:45:31.665082 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:31.665055 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert podName:71f3cd65-16e4-4173-821a-48a924d80e7a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:33.665038676 +0000 UTC m=+124.648504600 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-62t72" (UID: "71f3cd65-16e4-4173-821a-48a924d80e7a") : secret "networking-console-plugin-cert" not found
Apr 22 18:45:32.892176 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:32.892138 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-thx79" event={"ID":"0c7ad148-5682-41c7-874a-a20686c43134","Type":"ContainerStarted","Data":"a053ba7cd92437083d930cb029704dbb942981f203cc72f4f583b843bfc690c1"}
Apr 22 18:45:32.893413 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:32.893387 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769" event={"ID":"142521b5-1f3f-465a-ba35-247424907e87","Type":"ContainerStarted","Data":"54db002dc30594a83e815a915ab518d4ef508280f299fa3f780672ba08b84560"}
Apr 22 18:45:32.909086 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:32.909031 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-thx79" podStartSLOduration=1.6159243189999999 podStartE2EDuration="3.909015763s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="2026-04-22 18:45:30.206501398 +0000 UTC m=+121.189967317" lastFinishedPulling="2026-04-22 18:45:32.499592842 +0000 UTC m=+123.483058761" observedRunningTime="2026-04-22 18:45:32.908334502 +0000 UTC m=+123.891800443" watchObservedRunningTime="2026-04-22 18:45:32.909015763 +0000 UTC m=+123.892481705"
Apr 22 18:45:32.926044 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:32.926000 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769" podStartSLOduration=1.743688996 podStartE2EDuration="3.925988458s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="2026-04-22 18:45:30.320868183 +0000 UTC m=+121.304334102" lastFinishedPulling="2026-04-22 18:45:32.503167644 +0000 UTC m=+123.486633564" observedRunningTime="2026-04-22 18:45:32.92498922 +0000 UTC m=+123.908455164" watchObservedRunningTime="2026-04-22 18:45:32.925988458 +0000 UTC m=+123.909454398"
Apr 22 18:45:33.479545 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:33.479513 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:33.479769 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:33.479614 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n"
Apr 22 18:45:33.479769 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:33.479693 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:37.479674375 +0000 UTC m=+128.463140299 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : configmap references non-existent config key: service-ca.crt Apr 22 18:45:33.479769 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:33.479711 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:45:33.479769 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:33.479769 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:37.479756288 +0000 UTC m=+128.463222208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : secret "router-metrics-certs-default" not found Apr 22 18:45:33.681108 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:33.681041 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72" Apr 22 18:45:33.681288 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:33.681235 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:45:33.681346 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:33.681316 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert podName:71f3cd65-16e4-4173-821a-48a924d80e7a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:37.681294715 +0000 UTC m=+128.664760641 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-62t72" (UID: "71f3cd65-16e4-4173-821a-48a924d80e7a") : secret "networking-console-plugin-cert" not found Apr 22 18:45:34.718097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:34.718059 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs"] Apr 22 18:45:34.721147 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:34.721130 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs" Apr 22 18:45:34.725533 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:34.725502 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-58z5q\"" Apr 22 18:45:34.725533 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:34.725515 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:45:34.725696 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:34.725558 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:45:34.731385 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:34.731365 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs"] Apr 22 18:45:34.790464 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:34.790426 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxp8h\" (UniqueName: \"kubernetes.io/projected/c3364bfb-9927-4cd3-89a5-1137295070fd-kube-api-access-bxp8h\") pod \"migrator-74bb7799d9-4whhs\" (UID: \"c3364bfb-9927-4cd3-89a5-1137295070fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs" Apr 22 18:45:34.891099 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:34.891065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxp8h\" (UniqueName: \"kubernetes.io/projected/c3364bfb-9927-4cd3-89a5-1137295070fd-kube-api-access-bxp8h\") pod \"migrator-74bb7799d9-4whhs\" (UID: \"c3364bfb-9927-4cd3-89a5-1137295070fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs" Apr 22 18:45:34.904339 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:34.904307 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxp8h\" (UniqueName: \"kubernetes.io/projected/c3364bfb-9927-4cd3-89a5-1137295070fd-kube-api-access-bxp8h\") pod \"migrator-74bb7799d9-4whhs\" (UID: \"c3364bfb-9927-4cd3-89a5-1137295070fd\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs" Apr 22 18:45:35.030260 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:35.030187 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs" Apr 22 18:45:35.149516 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:35.149478 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs"] Apr 22 18:45:35.152757 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:45:35.152732 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3364bfb_9927_4cd3_89a5_1137295070fd.slice/crio-47cc206bc51cd48ab8c43610383fb65b1abf6fc6caac6caf97e73f092bcf105a WatchSource:0}: Error finding container 47cc206bc51cd48ab8c43610383fb65b1abf6fc6caac6caf97e73f092bcf105a: Status 404 returned error can't find the container with id 47cc206bc51cd48ab8c43610383fb65b1abf6fc6caac6caf97e73f092bcf105a Apr 22 18:45:35.458855 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:35.458826 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2z78z_4b955592-582e-4878-a4b9-99767a2aaefb/dns-node-resolver/0.log" Apr 22 18:45:35.900976 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:35.900937 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs" event={"ID":"c3364bfb-9927-4cd3-89a5-1137295070fd","Type":"ContainerStarted","Data":"47cc206bc51cd48ab8c43610383fb65b1abf6fc6caac6caf97e73f092bcf105a"} Apr 22 18:45:36.142031 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.141995 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-w7ggk"] Apr 22 18:45:36.145257 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.145236 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.147857 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.147834 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-plcrw\"" Apr 22 18:45:36.148456 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.148434 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 18:45:36.148556 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.148473 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 18:45:36.148556 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.148514 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 18:45:36.148925 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.148908 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 18:45:36.156691 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.156669 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-w7ggk"] Apr 22 18:45:36.303195 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.303166 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eedaedab-55b7-4947-8fb7-dfb3ad24b39c-signing-cabundle\") pod \"service-ca-865cb79987-w7ggk\" (UID: \"eedaedab-55b7-4947-8fb7-dfb3ad24b39c\") " pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.303353 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.303219 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/eedaedab-55b7-4947-8fb7-dfb3ad24b39c-signing-key\") pod \"service-ca-865cb79987-w7ggk\" (UID: \"eedaedab-55b7-4947-8fb7-dfb3ad24b39c\") " pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.303353 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.303245 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckz2r\" (UniqueName: \"kubernetes.io/projected/eedaedab-55b7-4947-8fb7-dfb3ad24b39c-kube-api-access-ckz2r\") pod \"service-ca-865cb79987-w7ggk\" (UID: \"eedaedab-55b7-4947-8fb7-dfb3ad24b39c\") " pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.403780 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.403758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eedaedab-55b7-4947-8fb7-dfb3ad24b39c-signing-key\") pod \"service-ca-865cb79987-w7ggk\" (UID: \"eedaedab-55b7-4947-8fb7-dfb3ad24b39c\") " pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.403875 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.403791 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckz2r\" (UniqueName: \"kubernetes.io/projected/eedaedab-55b7-4947-8fb7-dfb3ad24b39c-kube-api-access-ckz2r\") pod \"service-ca-865cb79987-w7ggk\" (UID: \"eedaedab-55b7-4947-8fb7-dfb3ad24b39c\") " pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.404007 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.403989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eedaedab-55b7-4947-8fb7-dfb3ad24b39c-signing-cabundle\") pod \"service-ca-865cb79987-w7ggk\" (UID: \"eedaedab-55b7-4947-8fb7-dfb3ad24b39c\") " pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.404595 ip-10-0-134-244 kubenswrapper[2570]: I0422 
18:45:36.404576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eedaedab-55b7-4947-8fb7-dfb3ad24b39c-signing-cabundle\") pod \"service-ca-865cb79987-w7ggk\" (UID: \"eedaedab-55b7-4947-8fb7-dfb3ad24b39c\") " pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.406083 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.406063 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eedaedab-55b7-4947-8fb7-dfb3ad24b39c-signing-key\") pod \"service-ca-865cb79987-w7ggk\" (UID: \"eedaedab-55b7-4947-8fb7-dfb3ad24b39c\") " pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.411972 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.411954 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckz2r\" (UniqueName: \"kubernetes.io/projected/eedaedab-55b7-4947-8fb7-dfb3ad24b39c-kube-api-access-ckz2r\") pod \"service-ca-865cb79987-w7ggk\" (UID: \"eedaedab-55b7-4947-8fb7-dfb3ad24b39c\") " pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.455662 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.455633 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-w7ggk" Apr 22 18:45:36.458871 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.458855 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gmml2_d150b818-e3a3-47e2-835c-16ae11dff162/node-ca/0.log" Apr 22 18:45:36.568121 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.568094 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-w7ggk"] Apr 22 18:45:36.571124 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:45:36.571099 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeedaedab_55b7_4947_8fb7_dfb3ad24b39c.slice/crio-1f8b8dde7ed2d1f67cd3e4857a30a8ad23fc366e7d7dd0d236c19248184d5fc1 WatchSource:0}: Error finding container 1f8b8dde7ed2d1f67cd3e4857a30a8ad23fc366e7d7dd0d236c19248184d5fc1: Status 404 returned error can't find the container with id 1f8b8dde7ed2d1f67cd3e4857a30a8ad23fc366e7d7dd0d236c19248184d5fc1 Apr 22 18:45:36.908672 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.908613 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-w7ggk" event={"ID":"eedaedab-55b7-4947-8fb7-dfb3ad24b39c","Type":"ContainerStarted","Data":"f98f9bd65f0c1fda7b3e9e6e5bc2bc91359b0a13e15559564ea838654c58fb4a"} Apr 22 18:45:36.909072 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.908677 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-w7ggk" event={"ID":"eedaedab-55b7-4947-8fb7-dfb3ad24b39c","Type":"ContainerStarted","Data":"1f8b8dde7ed2d1f67cd3e4857a30a8ad23fc366e7d7dd0d236c19248184d5fc1"} Apr 22 18:45:36.910247 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.910216 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs" 
event={"ID":"c3364bfb-9927-4cd3-89a5-1137295070fd","Type":"ContainerStarted","Data":"28c9d99b436ff6a69afc6993a2a3fa0c7898891f0a64ba31d6932d67bf636538"} Apr 22 18:45:36.910247 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.910244 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs" event={"ID":"c3364bfb-9927-4cd3-89a5-1137295070fd","Type":"ContainerStarted","Data":"45f54adf55718ff7f500f0c7e8dafaad863094a2a4a3cffa01353438b86dad0c"} Apr 22 18:45:36.956588 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.956546 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-4whhs" podStartSLOduration=1.803763689 podStartE2EDuration="2.956532257s" podCreationTimestamp="2026-04-22 18:45:34 +0000 UTC" firstStartedPulling="2026-04-22 18:45:35.155064504 +0000 UTC m=+126.138530423" lastFinishedPulling="2026-04-22 18:45:36.307833071 +0000 UTC m=+127.291298991" observedRunningTime="2026-04-22 18:45:36.956360344 +0000 UTC m=+127.939826283" watchObservedRunningTime="2026-04-22 18:45:36.956532257 +0000 UTC m=+127.939998200" Apr 22 18:45:36.956696 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:36.956643 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-w7ggk" podStartSLOduration=0.956637196 podStartE2EDuration="956.637196ms" podCreationTimestamp="2026-04-22 18:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:45:36.931741303 +0000 UTC m=+127.915207244" watchObservedRunningTime="2026-04-22 18:45:36.956637196 +0000 UTC m=+127.940103129" Apr 22 18:45:37.513447 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:37.513410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:37.513764 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:37.513514 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:37.513764 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:37.513603 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:45.513580589 +0000 UTC m=+136.497046511 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : configmap references non-existent config key: service-ca.crt Apr 22 18:45:37.513764 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:37.513674 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:45:37.513764 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:37.513730 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs podName:78f709d1-ef5e-40ac-8845-3f0108fe6b96 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:45.513714183 +0000 UTC m=+136.497180136 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs") pod "router-default-88bf8fcd4-w7x4n" (UID: "78f709d1-ef5e-40ac-8845-3f0108fe6b96") : secret "router-metrics-certs-default" not found Apr 22 18:45:37.715586 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:37.715547 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72" Apr 22 18:45:37.715791 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:37.715724 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:45:37.715867 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:37.715799 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert podName:71f3cd65-16e4-4173-821a-48a924d80e7a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:45.71577806 +0000 UTC m=+136.699243983 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-62t72" (UID: "71f3cd65-16e4-4173-821a-48a924d80e7a") : secret "networking-console-plugin-cert" not found Apr 22 18:45:39.226502 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:39.226464 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:45:39.226896 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:39.226605 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:45:39.226896 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:39.226686 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs podName:317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:41.226670285 +0000 UTC m=+252.210136204 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs") pod "network-metrics-daemon-w8q5c" (UID: "317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282") : secret "metrics-daemon-secret" not found Apr 22 18:45:45.578284 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:45.578246 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:45.578715 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:45.578335 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:45.578921 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:45.578901 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78f709d1-ef5e-40ac-8845-3f0108fe6b96-service-ca-bundle\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:45.580700 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:45.580681 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78f709d1-ef5e-40ac-8845-3f0108fe6b96-metrics-certs\") pod \"router-default-88bf8fcd4-w7x4n\" (UID: \"78f709d1-ef5e-40ac-8845-3f0108fe6b96\") " pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:45.696090 ip-10-0-134-244 kubenswrapper[2570]: 
I0422 18:45:45.696051 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:45.780502 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:45.780473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72" Apr 22 18:45:45.780650 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:45.780589 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:45:45.780715 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:45:45.780660 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert podName:71f3cd65-16e4-4173-821a-48a924d80e7a nodeName:}" failed. No retries permitted until 2026-04-22 18:46:01.780645421 +0000 UTC m=+152.764111340 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-62t72" (UID: "71f3cd65-16e4-4173-821a-48a924d80e7a") : secret "networking-console-plugin-cert" not found Apr 22 18:45:45.820550 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:45.820529 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-88bf8fcd4-w7x4n"] Apr 22 18:45:45.822858 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:45:45.822825 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78f709d1_ef5e_40ac_8845_3f0108fe6b96.slice/crio-27faa21a62eec740e45589658514b2314e8bb15d66bd47eb89ee8c255ac4140d WatchSource:0}: Error finding container 27faa21a62eec740e45589658514b2314e8bb15d66bd47eb89ee8c255ac4140d: Status 404 returned error can't find the container with id 27faa21a62eec740e45589658514b2314e8bb15d66bd47eb89ee8c255ac4140d Apr 22 18:45:45.933709 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:45.933675 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" event={"ID":"78f709d1-ef5e-40ac-8845-3f0108fe6b96","Type":"ContainerStarted","Data":"9e8966348316030074573fdf540f95d429e9fe28362b0254872274db8a19943e"} Apr 22 18:45:45.933851 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:45.933717 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" event={"ID":"78f709d1-ef5e-40ac-8845-3f0108fe6b96","Type":"ContainerStarted","Data":"27faa21a62eec740e45589658514b2314e8bb15d66bd47eb89ee8c255ac4140d"} Apr 22 18:45:45.954993 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:45.954948 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" 
podStartSLOduration=16.954931115 podStartE2EDuration="16.954931115s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:45:45.953493699 +0000 UTC m=+136.936959643" watchObservedRunningTime="2026-04-22 18:45:45.954931115 +0000 UTC m=+136.938397057" Apr 22 18:45:46.697231 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:46.697199 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:46.699552 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:46.699531 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:46.936861 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:46.936834 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:46.937965 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:46.937948 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-88bf8fcd4-w7x4n" Apr 22 18:45:58.896111 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:58.896077 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gvsxk"] Apr 22 18:45:58.899059 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:58.899030 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:58.901937 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:58.901911 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:45:58.902795 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:58.902772 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:45:58.902914 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:58.902860 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zv9q9\""
Apr 22 18:45:58.923767 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:58.923742 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gvsxk"]
Apr 22 18:45:59.083696 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.083654 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9a880c07-41b1-4390-9b45-e37ff33a6bbc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.083897 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.083753 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9a880c07-41b1-4390-9b45-e37ff33a6bbc-crio-socket\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.083897 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.083838 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a880c07-41b1-4390-9b45-e37ff33a6bbc-data-volume\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.083897 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.083867 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9a880c07-41b1-4390-9b45-e37ff33a6bbc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.083897 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.083885 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd2qr\" (UniqueName: \"kubernetes.io/projected/9a880c07-41b1-4390-9b45-e37ff33a6bbc-kube-api-access-vd2qr\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.185250 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.185142 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9a880c07-41b1-4390-9b45-e37ff33a6bbc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.185250 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.185188 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9a880c07-41b1-4390-9b45-e37ff33a6bbc-crio-socket\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.185486 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.185264 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/9a880c07-41b1-4390-9b45-e37ff33a6bbc-crio-socket\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.185486 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.185303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a880c07-41b1-4390-9b45-e37ff33a6bbc-data-volume\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.185486 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.185326 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9a880c07-41b1-4390-9b45-e37ff33a6bbc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.185486 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.185344 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd2qr\" (UniqueName: \"kubernetes.io/projected/9a880c07-41b1-4390-9b45-e37ff33a6bbc-kube-api-access-vd2qr\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.185741 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.185718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/9a880c07-41b1-4390-9b45-e37ff33a6bbc-data-volume\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.185789 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.185742 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/9a880c07-41b1-4390-9b45-e37ff33a6bbc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.187875 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.187857 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/9a880c07-41b1-4390-9b45-e37ff33a6bbc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.205264 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.205233 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd2qr\" (UniqueName: \"kubernetes.io/projected/9a880c07-41b1-4390-9b45-e37ff33a6bbc-kube-api-access-vd2qr\") pod \"insights-runtime-extractor-gvsxk\" (UID: \"9a880c07-41b1-4390-9b45-e37ff33a6bbc\") " pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.209214 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.209191 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gvsxk"
Apr 22 18:45:59.359329 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.359302 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gvsxk"]
Apr 22 18:45:59.361827 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:45:59.361777 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a880c07_41b1_4390_9b45_e37ff33a6bbc.slice/crio-b9f5185f3f1f3f9a92c33ed80b7fa5a3171e6e9bc2f959edbc5e196fe81b9a92 WatchSource:0}: Error finding container b9f5185f3f1f3f9a92c33ed80b7fa5a3171e6e9bc2f959edbc5e196fe81b9a92: Status 404 returned error can't find the container with id b9f5185f3f1f3f9a92c33ed80b7fa5a3171e6e9bc2f959edbc5e196fe81b9a92
Apr 22 18:45:59.972067 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.972036 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvsxk" event={"ID":"9a880c07-41b1-4390-9b45-e37ff33a6bbc","Type":"ContainerStarted","Data":"d430e699dc14e0d09d0cdacad348d5afbd71757026ffac0c57f52adacb7535df"}
Apr 22 18:45:59.972403 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:45:59.972075 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvsxk" event={"ID":"9a880c07-41b1-4390-9b45-e37ff33a6bbc","Type":"ContainerStarted","Data":"b9f5185f3f1f3f9a92c33ed80b7fa5a3171e6e9bc2f959edbc5e196fe81b9a92"}
Apr 22 18:46:00.976068 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:00.976033 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvsxk" event={"ID":"9a880c07-41b1-4390-9b45-e37ff33a6bbc","Type":"ContainerStarted","Data":"991cc6bf17bd8ecb594e2c901dd921efaf65a9a660a8668f3c648159bd7c5247"}
Apr 22 18:46:01.809394 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:01.809288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:46:01.812057 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:01.812029 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/71f3cd65-16e4-4173-821a-48a924d80e7a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-62t72\" (UID: \"71f3cd65-16e4-4173-821a-48a924d80e7a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:46:01.980172 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:01.980133 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gvsxk" event={"ID":"9a880c07-41b1-4390-9b45-e37ff33a6bbc","Type":"ContainerStarted","Data":"b876b551760b556d8d9ec640f22cfe41a6917e1b61fe5cbcb833f95bacd42bcf"}
Apr 22 18:46:02.016896 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:02.016848 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gvsxk" podStartSLOduration=2.052192439 podStartE2EDuration="4.016832275s" podCreationTimestamp="2026-04-22 18:45:58 +0000 UTC" firstStartedPulling="2026-04-22 18:45:59.416673982 +0000 UTC m=+150.400139905" lastFinishedPulling="2026-04-22 18:46:01.381313822 +0000 UTC m=+152.364779741" observedRunningTime="2026-04-22 18:46:02.013135712 +0000 UTC m=+152.996601653" watchObservedRunningTime="2026-04-22 18:46:02.016832275 +0000 UTC m=+153.000298259"
Apr 22 18:46:02.030069 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:02.030040 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72"
Apr 22 18:46:02.150455 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:02.150426 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-62t72"]
Apr 22 18:46:02.154006 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:46:02.153972 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f3cd65_16e4_4173_821a_48a924d80e7a.slice/crio-4dd90c6eb43cf89b2d4643691aa95e0ae336962a6a64dc48f6f88fa65a60477c WatchSource:0}: Error finding container 4dd90c6eb43cf89b2d4643691aa95e0ae336962a6a64dc48f6f88fa65a60477c: Status 404 returned error can't find the container with id 4dd90c6eb43cf89b2d4643691aa95e0ae336962a6a64dc48f6f88fa65a60477c
Apr 22 18:46:02.986078 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:02.986037 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72" event={"ID":"71f3cd65-16e4-4173-821a-48a924d80e7a","Type":"ContainerStarted","Data":"4dd90c6eb43cf89b2d4643691aa95e0ae336962a6a64dc48f6f88fa65a60477c"}
Apr 22 18:46:03.990406 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:03.990366 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72" event={"ID":"71f3cd65-16e4-4173-821a-48a924d80e7a","Type":"ContainerStarted","Data":"beb69b7ae0c60037a21001d086fb86a9afbc9bda4f8fc1c8a5149a2f9d05b79e"}
Apr 22 18:46:04.009374 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:04.009327 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-62t72" podStartSLOduration=33.732644325 podStartE2EDuration="35.009311552s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="2026-04-22 18:46:02.156108312 +0000 UTC m=+153.139574235" lastFinishedPulling="2026-04-22 18:46:03.432775543 +0000 UTC m=+154.416241462" observedRunningTime="2026-04-22 18:46:04.008196426 +0000 UTC m=+154.991662363" watchObservedRunningTime="2026-04-22 18:46:04.009311552 +0000 UTC m=+154.992777494"
Apr 22 18:46:05.909383 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:46:05.909336 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-6zh6w" podUID="ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d"
Apr 22 18:46:05.915493 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:46:05.915459 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-ltknd" podUID="57a5c4fb-aa29-4923-b848-a57df5a62462"
Apr 22 18:46:05.995813 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:05.995784 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ltknd"
Apr 22 18:46:05.995976 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:05.995957 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6zh6w"
Apr 22 18:46:07.420027 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.419991 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xg28t"]
Apr 22 18:46:07.422983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.422964 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.425654 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.425632 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 18:46:07.425836 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.425814 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 18:46:07.425967 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.425751 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 18:46:07.426098 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.426070 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 18:46:07.426207 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.426188 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xvkjw\""
Apr 22 18:46:07.426826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.426689 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 18:46:07.426826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.426692 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:46:07.449451 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.449423 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2669c4ce-5cb0-4d49-9870-a1001889ea0c-sys\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.449609 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.449464 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-textfile\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.449609 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.449499 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-tls\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.449609 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.449548 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2669c4ce-5cb0-4d49-9870-a1001889ea0c-metrics-client-ca\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.449609 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.449591 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-accelerators-collector-config\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.449855 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.449684 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-wtmp\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.449855 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.449753 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btkrm\" (UniqueName: \"kubernetes.io/projected/2669c4ce-5cb0-4d49-9870-a1001889ea0c-kube-api-access-btkrm\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.449855 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.449801 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.449855 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.449842 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2669c4ce-5cb0-4d49-9870-a1001889ea0c-root\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550463 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550429 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-wtmp\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550649 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550504 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btkrm\" (UniqueName: \"kubernetes.io/projected/2669c4ce-5cb0-4d49-9870-a1001889ea0c-kube-api-access-btkrm\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550649 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550545 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550649 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550579 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-wtmp\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550655 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2669c4ce-5cb0-4d49-9870-a1001889ea0c-root\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550586 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2669c4ce-5cb0-4d49-9870-a1001889ea0c-root\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550744 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2669c4ce-5cb0-4d49-9870-a1001889ea0c-sys\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550769 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-textfile\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-tls\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.550826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.550816 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2669c4ce-5cb0-4d49-9870-a1001889ea0c-metrics-client-ca\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.551122 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:46:07.551098 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 18:46:07.551183 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.551155 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-textfile\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.551183 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:46:07.551175 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-tls podName:2669c4ce-5cb0-4d49-9870-a1001889ea0c nodeName:}" failed. No retries permitted until 2026-04-22 18:46:08.05115285 +0000 UTC m=+159.034618777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-tls") pod "node-exporter-xg28t" (UID: "2669c4ce-5cb0-4d49-9870-a1001889ea0c") : secret "node-exporter-tls" not found
Apr 22 18:46:07.551300 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.551238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2669c4ce-5cb0-4d49-9870-a1001889ea0c-sys\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.551432 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.551408 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-accelerators-collector-config\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.551895 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.551832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2669c4ce-5cb0-4d49-9870-a1001889ea0c-metrics-client-ca\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.552105 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.551934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-accelerators-collector-config\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.553937 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.553911 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.559692 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:07.559645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btkrm\" (UniqueName: \"kubernetes.io/projected/2669c4ce-5cb0-4d49-9870-a1001889ea0c-kube-api-access-btkrm\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:07.566033 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:46:07.566003 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-w8q5c" podUID="317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282"
Apr 22 18:46:08.055608 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.055556 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-tls\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:08.058093 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.058063 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2669c4ce-5cb0-4d49-9870-a1001889ea0c-node-exporter-tls\") pod \"node-exporter-xg28t\" (UID: \"2669c4ce-5cb0-4d49-9870-a1001889ea0c\") " pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:08.334206 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.334116 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xg28t"
Apr 22 18:46:08.345149 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:46:08.345111 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2669c4ce_5cb0_4d49_9870_a1001889ea0c.slice/crio-62b8febaba62a7170f6ff099766dc18e3c538474b0370679fe5bbb5183a0368e WatchSource:0}: Error finding container 62b8febaba62a7170f6ff099766dc18e3c538474b0370679fe5bbb5183a0368e: Status 404 returned error can't find the container with id 62b8febaba62a7170f6ff099766dc18e3c538474b0370679fe5bbb5183a0368e
Apr 22 18:46:08.476841 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.476807 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:46:08.479677 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.479657 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:46:08.482207 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.482179 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 18:46:08.482207 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.482200 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 18:46:08.482402 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.482185 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 18:46:08.482458 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.482430 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 18:46:08.482458 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.482436 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 18:46:08.482715 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.482691 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 18:46:08.482883 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.482858 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cjmc2\""
Apr 22 18:46:08.482883 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.482862 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 18:46:08.483044 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.482902 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 18:46:08.483153 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.483138 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 18:46:08.494095 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.494075 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 18:46:08.560010 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.559978 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-config-volume\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:46:08.560010 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:46:08.560256 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560033 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-web-config\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:46:08.560256 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560052 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:46:08.560256 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560067 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2f2f\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-kube-api-access-z2f2f\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:46:08.560256 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560128 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:46:08.560256 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560159 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-config-out\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:46:08.560256 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 18:46:08.560256 ip-10-0-134-244
kubenswrapper[2570]: I0422 18:46:08.560219 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.560461 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560261 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.560461 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560278 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.560461 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560297 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-tls-assets\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.560461 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.560319 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.660950 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.660909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.660957 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.660991 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-tls-assets\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661027 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661063 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-config-volume\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661087 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661112 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-web-config\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661143 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661513 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661166 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2f2f\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-kube-api-access-z2f2f\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661513 ip-10-0-134-244 kubenswrapper[2570]: 
E0422 18:46:08.661174 2570 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 22 18:46:08.661513 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661513 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661241 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-config-out\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661513 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:46:08.661264 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-main-tls podName:88da529e-809b-42f5-a5d5-bbb569dab654 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:09.161241539 +0000 UTC m=+160.144707472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654") : secret "alertmanager-main-tls" not found Apr 22 18:46:08.661513 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.661513 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661382 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.663122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.661800 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.663122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.662283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
18:46:08.663122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.662543 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.664423 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.664394 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.664665 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.664644 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-config-volume\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.665314 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.665291 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-web-config\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.665424 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.665328 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
18:46:08.665786 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.665765 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-config-out\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.665918 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.665899 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-tls-assets\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.665958 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.665928 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.666735 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.666719 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:08.669480 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:08.669461 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2f2f\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-kube-api-access-z2f2f\") pod \"alertmanager-main-0\" (UID: 
\"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:09.005923 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.005844 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xg28t" event={"ID":"2669c4ce-5cb0-4d49-9870-a1001889ea0c","Type":"ContainerStarted","Data":"62b8febaba62a7170f6ff099766dc18e3c538474b0370679fe5bbb5183a0368e"} Apr 22 18:46:09.167199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.167164 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:09.169639 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.169595 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:09.389738 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.389694 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:09.487516 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.487482 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5578c6869-cvzhf"] Apr 22 18:46:09.494116 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.494086 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.497316 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.497289 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 18:46:09.497460 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.497323 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 18:46:09.497460 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.497306 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-phtn5\"" Apr 22 18:46:09.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.497455 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 18:46:09.497703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.497583 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 18:46:09.498195 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.498178 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-a0f0jhs20fv2q\"" Apr 22 18:46:09.498233 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.498206 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 18:46:09.507699 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.507679 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5578c6869-cvzhf"] Apr 22 18:46:09.531308 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.531238 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:46:09.534449 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:46:09.534414 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88da529e_809b_42f5_a5d5_bbb569dab654.slice/crio-19886abda5677155273e49fa760c7b27a0e23be1e57aefe5306e023a8df4f8a6 WatchSource:0}: Error finding container 19886abda5677155273e49fa760c7b27a0e23be1e57aefe5306e023a8df4f8a6: Status 404 returned error can't find the container with id 19886abda5677155273e49fa760c7b27a0e23be1e57aefe5306e023a8df4f8a6 Apr 22 18:46:09.570379 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.570353 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-grpc-tls\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.570511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.570399 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.570511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.570422 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwc9r\" (UniqueName: \"kubernetes.io/projected/bb72636f-5aad-4c98-b54d-2f66da79f35d-kube-api-access-vwc9r\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " 
pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.570656 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.570612 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.570726 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.570682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.570726 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.570710 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-tls\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.570816 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.570752 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 
22 18:46:09.570816 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.570782 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb72636f-5aad-4c98-b54d-2f66da79f35d-metrics-client-ca\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.671972 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.671935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-grpc-tls\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.672143 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.671985 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.672143 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.672009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwc9r\" (UniqueName: \"kubernetes.io/projected/bb72636f-5aad-4c98-b54d-2f66da79f35d-kube-api-access-vwc9r\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.672143 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.672042 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.672274 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.672201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.672274 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.672244 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-tls\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.672372 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.672293 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.672372 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.672319 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb72636f-5aad-4c98-b54d-2f66da79f35d-metrics-client-ca\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: 
\"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.673290 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.673259 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb72636f-5aad-4c98-b54d-2f66da79f35d-metrics-client-ca\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.674941 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.674908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-tls\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.675027 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.674969 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.675111 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.675095 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-grpc-tls\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.675171 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.675148 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.675286 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.675272 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.675401 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.675377 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb72636f-5aad-4c98-b54d-2f66da79f35d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.680687 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.680660 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwc9r\" (UniqueName: \"kubernetes.io/projected/bb72636f-5aad-4c98-b54d-2f66da79f35d-kube-api-access-vwc9r\") pod \"thanos-querier-5578c6869-cvzhf\" (UID: \"bb72636f-5aad-4c98-b54d-2f66da79f35d\") " pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.803372 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.803290 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:09.927838 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:09.927807 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5578c6869-cvzhf"] Apr 22 18:46:09.931410 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:46:09.931382 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb72636f_5aad_4c98_b54d_2f66da79f35d.slice/crio-60450059bfc36d576345365f45f4279a08284c9301013a8592a947d608d9ec34 WatchSource:0}: Error finding container 60450059bfc36d576345365f45f4279a08284c9301013a8592a947d608d9ec34: Status 404 returned error can't find the container with id 60450059bfc36d576345365f45f4279a08284c9301013a8592a947d608d9ec34 Apr 22 18:46:10.010045 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:10.010002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" event={"ID":"bb72636f-5aad-4c98-b54d-2f66da79f35d","Type":"ContainerStarted","Data":"60450059bfc36d576345365f45f4279a08284c9301013a8592a947d608d9ec34"} Apr 22 18:46:10.011294 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:10.011263 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerStarted","Data":"19886abda5677155273e49fa760c7b27a0e23be1e57aefe5306e023a8df4f8a6"} Apr 22 18:46:10.013086 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:10.013055 2570 generic.go:358] "Generic (PLEG): container finished" podID="2669c4ce-5cb0-4d49-9870-a1001889ea0c" containerID="d9afb88b59e17e84fbe8a8c343c830565a6c8903a0d014dafbe6ef159244ae41" exitCode=0 Apr 22 18:46:10.013230 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:10.013097 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xg28t" 
event={"ID":"2669c4ce-5cb0-4d49-9870-a1001889ea0c","Type":"ContainerDied","Data":"d9afb88b59e17e84fbe8a8c343c830565a6c8903a0d014dafbe6ef159244ae41"} Apr 22 18:46:10.881853 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:10.881800 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w" Apr 22 18:46:10.882333 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:10.881869 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd" Apr 22 18:46:10.884863 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:10.884833 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57a5c4fb-aa29-4923-b848-a57df5a62462-cert\") pod \"ingress-canary-ltknd\" (UID: \"57a5c4fb-aa29-4923-b848-a57df5a62462\") " pod="openshift-ingress-canary/ingress-canary-ltknd" Apr 22 18:46:10.885275 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:10.885252 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d-metrics-tls\") pod \"dns-default-6zh6w\" (UID: \"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d\") " pod="openshift-dns/dns-default-6zh6w" Apr 22 18:46:11.017751 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.017704 2570 generic.go:358] "Generic (PLEG): container finished" podID="88da529e-809b-42f5-a5d5-bbb569dab654" containerID="7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048" exitCode=0 Apr 22 18:46:11.017917 ip-10-0-134-244 kubenswrapper[2570]: 
I0422 18:46:11.017791 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerDied","Data":"7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048"} Apr 22 18:46:11.020277 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.020248 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xg28t" event={"ID":"2669c4ce-5cb0-4d49-9870-a1001889ea0c","Type":"ContainerStarted","Data":"09e169a591faf512eef4f2463cfc4659b059144897be3b158535f23a7b0f32d4"} Apr 22 18:46:11.020406 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.020290 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xg28t" event={"ID":"2669c4ce-5cb0-4d49-9870-a1001889ea0c","Type":"ContainerStarted","Data":"bb63ca6e6c00616b226b9fe5a11e273dec61c5188b88f6ac30914a48c2383801"} Apr 22 18:46:11.100223 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.100185 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xcpp5\"" Apr 22 18:46:11.100409 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.100381 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xmncv\"" Apr 22 18:46:11.107147 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.107121 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ltknd" Apr 22 18:46:11.107292 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.107120 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6zh6w" Apr 22 18:46:11.263993 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.263946 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xg28t" podStartSLOduration=3.531077773 podStartE2EDuration="4.2639271s" podCreationTimestamp="2026-04-22 18:46:07 +0000 UTC" firstStartedPulling="2026-04-22 18:46:08.347379637 +0000 UTC m=+159.330845559" lastFinishedPulling="2026-04-22 18:46:09.080228964 +0000 UTC m=+160.063694886" observedRunningTime="2026-04-22 18:46:11.189558742 +0000 UTC m=+162.173024877" watchObservedRunningTime="2026-04-22 18:46:11.2639271 +0000 UTC m=+162.247393041" Apr 22 18:46:11.264341 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.264324 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6zh6w"] Apr 22 18:46:11.266580 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:46:11.266555 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef9eca3e_08aa_4a39_b6ce_a3c3f71dc95d.slice/crio-dccd57e10fbeb092b8eac634aa7bdbf526af430c5f1fb5854e5b0399cf62057c WatchSource:0}: Error finding container dccd57e10fbeb092b8eac634aa7bdbf526af430c5f1fb5854e5b0399cf62057c: Status 404 returned error can't find the container with id dccd57e10fbeb092b8eac634aa7bdbf526af430c5f1fb5854e5b0399cf62057c Apr 22 18:46:11.281840 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.281816 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ltknd"] Apr 22 18:46:11.285719 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:46:11.285685 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a5c4fb_aa29_4923_b848_a57df5a62462.slice/crio-79a5cd9ad6ae4dfbd6887bbf39461f3eef7ec5a32971dbf1fc1ccda184a8a333 WatchSource:0}: Error finding container 
79a5cd9ad6ae4dfbd6887bbf39461f3eef7ec5a32971dbf1fc1ccda184a8a333: Status 404 returned error can't find the container with id 79a5cd9ad6ae4dfbd6887bbf39461f3eef7ec5a32971dbf1fc1ccda184a8a333 Apr 22 18:46:11.785965 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.785922 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7d7464b78d-44zkq"] Apr 22 18:46:11.788470 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.788437 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.790971 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.790949 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 18:46:11.792134 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.791989 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 18:46:11.792134 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.792024 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-agbp49dg02g4c\"" Apr 22 18:46:11.792134 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.792054 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:46:11.792529 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.792511 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-fpmgw\"" Apr 22 18:46:11.792640 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.792584 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 18:46:11.799334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.799314 2570 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d7464b78d-44zkq"] Apr 22 18:46:11.891931 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.891888 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95e79537-ff76-4401-abc7-6ccb62d28f5b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.892354 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.891943 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/95e79537-ff76-4401-abc7-6ccb62d28f5b-metrics-server-audit-profiles\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.892354 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.892083 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/95e79537-ff76-4401-abc7-6ccb62d28f5b-audit-log\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.892354 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.892111 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e79537-ff76-4401-abc7-6ccb62d28f5b-client-ca-bundle\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.892354 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:46:11.892148 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/95e79537-ff76-4401-abc7-6ccb62d28f5b-secret-metrics-server-tls\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.892354 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.892177 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/95e79537-ff76-4401-abc7-6ccb62d28f5b-secret-metrics-server-client-certs\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.892354 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.892204 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h94q6\" (UniqueName: \"kubernetes.io/projected/95e79537-ff76-4401-abc7-6ccb62d28f5b-kube-api-access-h94q6\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.993699 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.993652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95e79537-ff76-4401-abc7-6ccb62d28f5b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.993811 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.993717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/95e79537-ff76-4401-abc7-6ccb62d28f5b-metrics-server-audit-profiles\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.993862 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.993826 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/95e79537-ff76-4401-abc7-6ccb62d28f5b-audit-log\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.993862 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.993853 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e79537-ff76-4401-abc7-6ccb62d28f5b-client-ca-bundle\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.993951 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.993887 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/95e79537-ff76-4401-abc7-6ccb62d28f5b-secret-metrics-server-tls\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.993951 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.993919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/95e79537-ff76-4401-abc7-6ccb62d28f5b-secret-metrics-server-client-certs\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " 
pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.994049 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.993945 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h94q6\" (UniqueName: \"kubernetes.io/projected/95e79537-ff76-4401-abc7-6ccb62d28f5b-kube-api-access-h94q6\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.994531 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.994465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/95e79537-ff76-4401-abc7-6ccb62d28f5b-audit-log\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.994649 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.994610 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95e79537-ff76-4401-abc7-6ccb62d28f5b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.995648 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.995600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/95e79537-ff76-4401-abc7-6ccb62d28f5b-metrics-server-audit-profiles\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.997460 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.997436 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" 
(UniqueName: \"kubernetes.io/secret/95e79537-ff76-4401-abc7-6ccb62d28f5b-secret-metrics-server-tls\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.997659 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.997638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e79537-ff76-4401-abc7-6ccb62d28f5b-client-ca-bundle\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:11.998788 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:11.998763 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/95e79537-ff76-4401-abc7-6ccb62d28f5b-secret-metrics-server-client-certs\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:12.003182 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.003156 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h94q6\" (UniqueName: \"kubernetes.io/projected/95e79537-ff76-4401-abc7-6ccb62d28f5b-kube-api-access-h94q6\") pod \"metrics-server-7d7464b78d-44zkq\" (UID: \"95e79537-ff76-4401-abc7-6ccb62d28f5b\") " pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:12.024893 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.024855 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6zh6w" event={"ID":"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d","Type":"ContainerStarted","Data":"dccd57e10fbeb092b8eac634aa7bdbf526af430c5f1fb5854e5b0399cf62057c"} Apr 22 18:46:12.025976 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.025947 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ltknd" event={"ID":"57a5c4fb-aa29-4923-b848-a57df5a62462","Type":"ContainerStarted","Data":"79a5cd9ad6ae4dfbd6887bbf39461f3eef7ec5a32971dbf1fc1ccda184a8a333"} Apr 22 18:46:12.102226 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.102194 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:12.160290 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.159561 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj"] Apr 22 18:46:12.162297 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.162270 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" Apr 22 18:46:12.164870 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.164844 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 18:46:12.165252 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.165012 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-z6zzk\"" Apr 22 18:46:12.170444 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.170403 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj"] Apr 22 18:46:12.196478 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.196371 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a92952e7-39fe-4887-b105-21e90a80d306-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7fjpj\" (UID: \"a92952e7-39fe-4887-b105-21e90a80d306\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" Apr 22 18:46:12.297703 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:46:12.297649 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a92952e7-39fe-4887-b105-21e90a80d306-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7fjpj\" (UID: \"a92952e7-39fe-4887-b105-21e90a80d306\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" Apr 22 18:46:12.298296 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:46:12.297879 2570 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 18:46:12.298296 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:46:12.297953 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a92952e7-39fe-4887-b105-21e90a80d306-monitoring-plugin-cert podName:a92952e7-39fe-4887-b105-21e90a80d306 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:12.797931915 +0000 UTC m=+163.781397835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/a92952e7-39fe-4887-b105-21e90a80d306-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-7fjpj" (UID: "a92952e7-39fe-4887-b105-21e90a80d306") : secret "monitoring-plugin-cert" not found Apr 22 18:46:12.618694 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.618655 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-64fdcd7476-5r6w8"] Apr 22 18:46:12.621995 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.621951 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.625419 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.624995 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-ljhv8\"" Apr 22 18:46:12.625419 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.625248 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 18:46:12.625419 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.625272 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 18:46:12.625962 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.625656 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 18:46:12.625962 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.625848 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 18:46:12.626208 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.626135 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 18:46:12.648938 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.642914 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 18:46:12.652249 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.652177 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64fdcd7476-5r6w8"] Apr 22 18:46:12.701004 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.700976 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-metrics-client-ca\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.701128 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.701012 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-telemeter-client-tls\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.701128 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.701035 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-serving-certs-ca-bundle\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.701128 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.701057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-federate-client-tls\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.701128 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.701090 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-secret-telemeter-client\") pod 
\"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.701128 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.701107 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.701383 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.701162 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.701383 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.701178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvkgq\" (UniqueName: \"kubernetes.io/projected/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-kube-api-access-jvkgq\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.781939 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.781803 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d7464b78d-44zkq"] Apr 22 18:46:12.787896 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:46:12.787868 2570 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95e79537_ff76_4401_abc7_6ccb62d28f5b.slice/crio-2fd4b377b88f4dc085aaa7134e99bfecacf385e30b0c6fe9f85c14ddc6d45613 WatchSource:0}: Error finding container 2fd4b377b88f4dc085aaa7134e99bfecacf385e30b0c6fe9f85c14ddc6d45613: Status 404 returned error can't find the container with id 2fd4b377b88f4dc085aaa7134e99bfecacf385e30b0c6fe9f85c14ddc6d45613 Apr 22 18:46:12.802642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.802027 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.802642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.802068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvkgq\" (UniqueName: \"kubernetes.io/projected/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-kube-api-access-jvkgq\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.802642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.802120 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-metrics-client-ca\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.802642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.802147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-telemeter-client-tls\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.802642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.802176 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-serving-certs-ca-bundle\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.802642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.802212 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-federate-client-tls\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.802642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.802260 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-secret-telemeter-client\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.802642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.802286 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " 
pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.802642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.802354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a92952e7-39fe-4887-b105-21e90a80d306-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7fjpj\" (UID: \"a92952e7-39fe-4887-b105-21e90a80d306\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" Apr 22 18:46:12.803558 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.803168 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-telemeter-trusted-ca-bundle\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.804180 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.804149 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-metrics-client-ca\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.804735 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.804478 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-serving-certs-ca-bundle\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.807213 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.807184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-secret-telemeter-client\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.807399 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.807353 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-telemeter-client-tls\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.808608 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.808564 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a92952e7-39fe-4887-b105-21e90a80d306-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7fjpj\" (UID: \"a92952e7-39fe-4887-b105-21e90a80d306\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" Apr 22 18:46:12.809014 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.808893 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.809014 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.809006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-federate-client-tls\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " 
pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.812733 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.812706 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvkgq\" (UniqueName: \"kubernetes.io/projected/4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af-kube-api-access-jvkgq\") pod \"telemeter-client-64fdcd7476-5r6w8\" (UID: \"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af\") " pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:12.937851 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:12.937758 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" Apr 22 18:46:13.031803 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.031765 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" event={"ID":"bb72636f-5aad-4c98-b54d-2f66da79f35d","Type":"ContainerStarted","Data":"9fc7ed9e6ca829183074d94735de8fa9ba386b2d4b80aff6062d8f2b17107e30"} Apr 22 18:46:13.031803 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.031808 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" event={"ID":"bb72636f-5aad-4c98-b54d-2f66da79f35d","Type":"ContainerStarted","Data":"3fc811bd74037d6dba3f896fa44ddd16b5b8090285c37addd5bd9b69b4f683cc"} Apr 22 18:46:13.031803 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.031818 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" event={"ID":"bb72636f-5aad-4c98-b54d-2f66da79f35d","Type":"ContainerStarted","Data":"1630c6306fdc9d6d01536db1952bfe4af0a2b48616e90f48078aeac6207a36e4"} Apr 22 18:46:13.034049 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.033999 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerStarted","Data":"5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b"} Apr 22 18:46:13.034178 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.034060 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerStarted","Data":"28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb"} Apr 22 18:46:13.035053 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.035016 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" event={"ID":"95e79537-ff76-4401-abc7-6ccb62d28f5b","Type":"ContainerStarted","Data":"2fd4b377b88f4dc085aaa7134e99bfecacf385e30b0c6fe9f85c14ddc6d45613"} Apr 22 18:46:13.078489 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.078448 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" Apr 22 18:46:13.683892 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.683855 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:46:13.687384 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.687353 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.691857 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.691828 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:46:13.692126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.692106 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:46:13.692689 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.692667 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:46:13.692789 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.692725 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:46:13.693056 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.693031 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:46:13.693297 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.693272 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:46:13.693413 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.693394 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-12g2s2bc200p7\"" Apr 22 18:46:13.693475 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.693279 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:46:13.693475 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.693451 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:46:13.693582 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.693497 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:46:13.693582 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.693453 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:46:13.693714 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.693695 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-785mk\"" Apr 22 18:46:13.698365 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.698346 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:46:13.699473 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.699451 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:46:13.709926 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.709901 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.709959 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710090 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710059 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710144 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710109 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzgf8\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-kube-api-access-zzgf8\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710144 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710130 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-config\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710261 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710186 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710261 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710232 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710369 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710266 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710369 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710302 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710369 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710360 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710462 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710376 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-web-config\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710462 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710404 
2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710462 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710430 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710462 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710450 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710637 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710467 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710637 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710504 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-metrics-client-ca\") pod 
\"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710637 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710531 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.710637 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.710600 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-config-out\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.715421 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.715400 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:46:13.811385 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811385 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811396 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811594 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-config-out\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811594 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811467 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811594 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811506 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811594 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811569 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811808 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811640 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzgf8\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-kube-api-access-zzgf8\") pod 
\"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811808 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811667 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-config\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811808 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811696 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811808 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811749 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.811808 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811778 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.812034 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811811 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.812034 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.812034 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811871 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-web-config\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.812034 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.812034 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811930 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.812034 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811963 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.812034 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.811990 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.812365 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.812254 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.812365 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.812293 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.814874 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.812804 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.814874 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:46:13.813859 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.814874 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.814539 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.815668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.815530 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.815668 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.815587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.815926 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.815881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.816511 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.816485 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.816903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.816882 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-config-out\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.817544 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.817518 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.818844 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.818456 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.818844 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.818787 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.818998 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.818903 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.819230 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.819132 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.819563 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.819543 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-web-config\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.820374 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.820352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-config\") pod \"prometheus-k8s-0\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:13.831198 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:13.831163 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzgf8\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-kube-api-access-zzgf8\") pod \"prometheus-k8s-0\" (UID: 
\"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:14.001158 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:14.001038 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:14.349717 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:14.348866 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj"] Apr 22 18:46:14.397552 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:14.397462 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:46:14.421282 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:14.421238 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-64fdcd7476-5r6w8"] Apr 22 18:46:14.530831 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:46:14.530793 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d11bfd0_3b3f_43f2_ae0c_bc7b4f8598af.slice/crio-1daf27199ea298dd2db4568ddf7b5690448f1bddf54cc96d3dbf25e605ce5d91 WatchSource:0}: Error finding container 1daf27199ea298dd2db4568ddf7b5690448f1bddf54cc96d3dbf25e605ce5d91: Status 404 returned error can't find the container with id 1daf27199ea298dd2db4568ddf7b5690448f1bddf54cc96d3dbf25e605ce5d91 Apr 22 18:46:14.531435 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:46:14.531405 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb8ac54e_a082_459a_a5ea_76e30144ed07.slice/crio-1b58437918ed8bda735686bd807e9589e0bdb25de2e9a73750deb5fd2f1586e0 WatchSource:0}: Error finding container 1b58437918ed8bda735686bd807e9589e0bdb25de2e9a73750deb5fd2f1586e0: Status 404 returned error can't find the container with id 1b58437918ed8bda735686bd807e9589e0bdb25de2e9a73750deb5fd2f1586e0 Apr 22 
18:46:15.047390 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.047297 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ltknd" event={"ID":"57a5c4fb-aa29-4923-b848-a57df5a62462","Type":"ContainerStarted","Data":"f0e764656092bfc6f456f4195760c8b7c9e0cf3ef3f18287b2e46b7f6b996b5b"} Apr 22 18:46:15.050230 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.050198 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" event={"ID":"95e79537-ff76-4401-abc7-6ccb62d28f5b","Type":"ContainerStarted","Data":"e69773c6e48758109ee136ee04635b6350bfbfb5d04a929ded35e9166009a570"} Apr 22 18:46:15.055660 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.055510 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" event={"ID":"bb72636f-5aad-4c98-b54d-2f66da79f35d","Type":"ContainerStarted","Data":"658e494d9a827812aa3746c550b5eab87b9b981e17c6145ec6c6e41ffde0958f"} Apr 22 18:46:15.055660 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.055545 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" event={"ID":"bb72636f-5aad-4c98-b54d-2f66da79f35d","Type":"ContainerStarted","Data":"98265c5a44eed6976b4e08052fcbfb55543e6befc9d2cc9f15b086d49967044d"} Apr 22 18:46:15.055660 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.055558 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" event={"ID":"bb72636f-5aad-4c98-b54d-2f66da79f35d","Type":"ContainerStarted","Data":"f07bbe89ece7886f74e95576bab8cdc97f3b1499da7a98350c216573ee36eb29"} Apr 22 18:46:15.055982 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.055954 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:15.060929 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:46:15.060883 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerStarted","Data":"6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a"} Apr 22 18:46:15.060929 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.060917 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerStarted","Data":"ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439"} Apr 22 18:46:15.060929 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.060934 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerStarted","Data":"a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1"} Apr 22 18:46:15.061138 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.060947 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerStarted","Data":"3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6"} Apr 22 18:46:15.063891 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.063454 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6zh6w" event={"ID":"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d","Type":"ContainerStarted","Data":"acb94f6e3ee1c28faf90cb9ffbfdaa573160786067fb0d6b0b7d04e4275eca5a"} Apr 22 18:46:15.063891 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.063517 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6zh6w" event={"ID":"ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d","Type":"ContainerStarted","Data":"5a034a31a34f4a4523560b0892ecfe39344de982b4117feaf841e3ac2999ddcd"} Apr 22 18:46:15.064048 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:46:15.063911 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-6zh6w" Apr 22 18:46:15.065122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.065095 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" event={"ID":"a92952e7-39fe-4887-b105-21e90a80d306","Type":"ContainerStarted","Data":"4ee92cb24cd2ec8b3e9d2da44a99b1af1449fd0b6255950f83cc26e2e1842528"} Apr 22 18:46:15.066990 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.066964 2570 generic.go:358] "Generic (PLEG): container finished" podID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerID="7436b19beaf7703a1ab5990581e009f354e9033b96f0791a8be329c4560a5e20" exitCode=0 Apr 22 18:46:15.067281 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.067212 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerDied","Data":"7436b19beaf7703a1ab5990581e009f354e9033b96f0791a8be329c4560a5e20"} Apr 22 18:46:15.067281 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.067248 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerStarted","Data":"1b58437918ed8bda735686bd807e9589e0bdb25de2e9a73750deb5fd2f1586e0"} Apr 22 18:46:15.069651 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.068930 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ltknd" podStartSLOduration=130.181449205 podStartE2EDuration="2m13.068914522s" podCreationTimestamp="2026-04-22 18:44:02 +0000 UTC" firstStartedPulling="2026-04-22 18:46:11.287496691 +0000 UTC m=+162.270962610" lastFinishedPulling="2026-04-22 18:46:14.174961994 +0000 UTC m=+165.158427927" observedRunningTime="2026-04-22 18:46:15.067832339 +0000 UTC 
m=+166.051298280" watchObservedRunningTime="2026-04-22 18:46:15.068914522 +0000 UTC m=+166.052380464" Apr 22 18:46:15.070597 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.070253 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" event={"ID":"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af","Type":"ContainerStarted","Data":"1daf27199ea298dd2db4568ddf7b5690448f1bddf54cc96d3dbf25e605ce5d91"} Apr 22 18:46:15.095648 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.095016 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" podStartSLOduration=1.469158163 podStartE2EDuration="6.094992715s" podCreationTimestamp="2026-04-22 18:46:09 +0000 UTC" firstStartedPulling="2026-04-22 18:46:09.933262041 +0000 UTC m=+160.916727963" lastFinishedPulling="2026-04-22 18:46:14.559096584 +0000 UTC m=+165.542562515" observedRunningTime="2026-04-22 18:46:15.093218675 +0000 UTC m=+166.076684617" watchObservedRunningTime="2026-04-22 18:46:15.094992715 +0000 UTC m=+166.078458657" Apr 22 18:46:15.178410 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.178338 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.069778205 podStartE2EDuration="7.178316544s" podCreationTimestamp="2026-04-22 18:46:08 +0000 UTC" firstStartedPulling="2026-04-22 18:46:09.53647804 +0000 UTC m=+160.519943963" lastFinishedPulling="2026-04-22 18:46:12.645016369 +0000 UTC m=+163.628482302" observedRunningTime="2026-04-22 18:46:15.176098384 +0000 UTC m=+166.159564345" watchObservedRunningTime="2026-04-22 18:46:15.178316544 +0000 UTC m=+166.161782486" Apr 22 18:46:15.211498 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.211438 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6zh6w" podStartSLOduration=130.316366856 
podStartE2EDuration="2m13.211421431s" podCreationTimestamp="2026-04-22 18:44:02 +0000 UTC" firstStartedPulling="2026-04-22 18:46:11.269217891 +0000 UTC m=+162.252683812" lastFinishedPulling="2026-04-22 18:46:14.164272463 +0000 UTC m=+165.147738387" observedRunningTime="2026-04-22 18:46:15.209030271 +0000 UTC m=+166.192496251" watchObservedRunningTime="2026-04-22 18:46:15.211421431 +0000 UTC m=+166.194887374" Apr 22 18:46:15.233891 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:15.233454 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" podStartSLOduration=2.847258514 podStartE2EDuration="4.233435267s" podCreationTimestamp="2026-04-22 18:46:11 +0000 UTC" firstStartedPulling="2026-04-22 18:46:12.791989502 +0000 UTC m=+163.775455437" lastFinishedPulling="2026-04-22 18:46:14.178166264 +0000 UTC m=+165.161632190" observedRunningTime="2026-04-22 18:46:15.231064598 +0000 UTC m=+166.214530564" watchObservedRunningTime="2026-04-22 18:46:15.233435267 +0000 UTC m=+166.216901210" Apr 22 18:46:16.076765 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:16.076666 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" event={"ID":"a92952e7-39fe-4887-b105-21e90a80d306","Type":"ContainerStarted","Data":"88e99520d318216d4ef754bb26a499d26034e7d3d48cc16fb4aa2530bf111414"} Apr 22 18:46:16.077192 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:16.077079 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" Apr 22 18:46:16.082496 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:16.082456 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" Apr 22 18:46:16.106439 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:16.106378 2570 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7fjpj" podStartSLOduration=2.820302749 podStartE2EDuration="4.106362819s" podCreationTimestamp="2026-04-22 18:46:12 +0000 UTC" firstStartedPulling="2026-04-22 18:46:14.371118472 +0000 UTC m=+165.354584395" lastFinishedPulling="2026-04-22 18:46:15.657178538 +0000 UTC m=+166.640644465" observedRunningTime="2026-04-22 18:46:16.091529362 +0000 UTC m=+167.074995304" watchObservedRunningTime="2026-04-22 18:46:16.106362819 +0000 UTC m=+167.089828763" Apr 22 18:46:17.083053 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:17.083018 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" event={"ID":"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af","Type":"ContainerStarted","Data":"4cce88ae260e263223f02a572accae1a6c615717506d837f6440a34354681896"} Apr 22 18:46:17.083053 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:17.083058 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" event={"ID":"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af","Type":"ContainerStarted","Data":"d7bc078f10df1c2a62df8cf87ee4a0c20acb8737a5a1ac2d9f7d525f6ebe516c"} Apr 22 18:46:17.083591 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:17.083067 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" event={"ID":"4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af","Type":"ContainerStarted","Data":"6d9aa2adba55d66db7af488681b02fa5e5dfcfaf3cd00a4c01756300963ffc79"} Apr 22 18:46:19.091565 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:19.091523 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerStarted","Data":"b985b0d57fb29542afb079318677e517dfd3f222e65a503cd2bb14b67c90776b"} Apr 22 18:46:19.091565 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:19.091561 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerStarted","Data":"e9be094c06b5687a611ebf0ec74075f879aeef475771e561576b0e782cc7ba02"} Apr 22 18:46:19.091565 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:19.091571 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerStarted","Data":"30d9c17d9a434865a754493bf87789b75ba742ba0e610e9fbb467008fa9e41a2"} Apr 22 18:46:19.092037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:19.091580 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerStarted","Data":"cc5d41e47bf7ef26d5dffbae4ff98a2fb139849a5c26b2e3e9d0085b6ebf53f1"} Apr 22 18:46:19.092037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:19.091589 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerStarted","Data":"3b783696f9600b29c9d61f9522c9ee0bdcbd2f9be1dae914664f75af1ed073cb"} Apr 22 18:46:19.092037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:19.091596 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerStarted","Data":"3ecd98642e846447ea536a9b1be2f9860973fe3eec43b8309f438a5ced928e78"} Apr 22 18:46:19.123237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:19.123187 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-64fdcd7476-5r6w8" podStartSLOduration=5.438918123 podStartE2EDuration="7.123173785s" podCreationTimestamp="2026-04-22 18:46:12 +0000 UTC" firstStartedPulling="2026-04-22 18:46:14.533704291 +0000 UTC m=+165.517170225" 
lastFinishedPulling="2026-04-22 18:46:16.217959964 +0000 UTC m=+167.201425887" observedRunningTime="2026-04-22 18:46:17.115047691 +0000 UTC m=+168.098513634" watchObservedRunningTime="2026-04-22 18:46:19.123173785 +0000 UTC m=+170.106639726" Apr 22 18:46:19.124139 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:19.124112 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.151227314 podStartE2EDuration="6.124104037s" podCreationTimestamp="2026-04-22 18:46:13 +0000 UTC" firstStartedPulling="2026-04-22 18:46:15.069161604 +0000 UTC m=+166.052627526" lastFinishedPulling="2026-04-22 18:46:18.042038328 +0000 UTC m=+169.025504249" observedRunningTime="2026-04-22 18:46:19.121775891 +0000 UTC m=+170.105241843" watchObservedRunningTime="2026-04-22 18:46:19.124104037 +0000 UTC m=+170.107569977" Apr 22 18:46:21.083538 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:21.083505 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5578c6869-cvzhf" Apr 22 18:46:22.545681 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:22.545612 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:46:24.001256 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:24.001202 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:46:25.080664 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:25.080604 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6zh6w" Apr 22 18:46:32.102607 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:32.102555 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:32.102607 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:32.102610 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:52.107994 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:52.107961 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:52.111898 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:52.111873 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7d7464b78d-44zkq" Apr 22 18:46:53.196322 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:53.196291 2570 generic.go:358] "Generic (PLEG): container finished" podID="0c7ad148-5682-41c7-874a-a20686c43134" containerID="a053ba7cd92437083d930cb029704dbb942981f203cc72f4f583b843bfc690c1" exitCode=0 Apr 22 18:46:53.196703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:53.196342 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-thx79" event={"ID":"0c7ad148-5682-41c7-874a-a20686c43134","Type":"ContainerDied","Data":"a053ba7cd92437083d930cb029704dbb942981f203cc72f4f583b843bfc690c1"} Apr 22 18:46:53.196703 
ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:53.196701 2570 scope.go:117] "RemoveContainer" containerID="a053ba7cd92437083d930cb029704dbb942981f203cc72f4f583b843bfc690c1" Apr 22 18:46:54.051946 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:54.051901 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-88bf8fcd4-w7x4n_78f709d1-ef5e-40ac-8845-3f0108fe6b96/router/0.log" Apr 22 18:46:54.104129 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:54.104096 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ltknd_57a5c4fb-aa29-4923-b848-a57df5a62462/serve-healthcheck-canary/0.log" Apr 22 18:46:54.201413 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:46:54.201380 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-thx79" event={"ID":"0c7ad148-5682-41c7-874a-a20686c43134","Type":"ContainerStarted","Data":"f91ad886790c5161fe20b59be829c73d3a1d67ba0b14ab95b9c0100083d423fc"} Apr 22 18:47:04.231757 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:04.231722 2570 generic.go:358] "Generic (PLEG): container finished" podID="142521b5-1f3f-465a-ba35-247424907e87" containerID="54db002dc30594a83e815a915ab518d4ef508280f299fa3f780672ba08b84560" exitCode=0 Apr 22 18:47:04.232201 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:04.231768 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769" event={"ID":"142521b5-1f3f-465a-ba35-247424907e87","Type":"ContainerDied","Data":"54db002dc30594a83e815a915ab518d4ef508280f299fa3f780672ba08b84560"} Apr 22 18:47:04.232201 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:04.232178 2570 scope.go:117] "RemoveContainer" containerID="54db002dc30594a83e815a915ab518d4ef508280f299fa3f780672ba08b84560" Apr 22 18:47:05.236854 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:05.236815 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-t8769" event={"ID":"142521b5-1f3f-465a-ba35-247424907e87","Type":"ContainerStarted","Data":"6a9b332571ec0d935f9ad1b28c7a61735bef31ec118d5f1905a21ad766af79c6"} Apr 22 18:47:14.001166 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:14.001128 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:14.017403 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:14.017374 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:14.282753 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:14.282688 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:27.869467 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:27.869426 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:47:27.870197 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:27.870120 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="alertmanager" containerID="cri-o://28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb" gracePeriod=120 Apr 22 18:47:27.870276 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:27.870154 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy-metric" containerID="cri-o://ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439" gracePeriod=120 Apr 22 18:47:27.870276 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:27.870193 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy-web" containerID="cri-o://3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6" gracePeriod=120 Apr 22 18:47:27.870276 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:27.870215 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="prom-label-proxy" containerID="cri-o://6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a" gracePeriod=120 Apr 22 18:47:27.870431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:27.870286 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy" containerID="cri-o://a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1" gracePeriod=120 Apr 22 18:47:27.870431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:27.870317 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="config-reloader" containerID="cri-o://5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b" gracePeriod=120 Apr 22 18:47:28.312795 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:28.312704 2570 generic.go:358] "Generic (PLEG): container finished" podID="88da529e-809b-42f5-a5d5-bbb569dab654" containerID="6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a" exitCode=0 Apr 22 18:47:28.312795 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:28.312731 2570 generic.go:358] "Generic (PLEG): container finished" podID="88da529e-809b-42f5-a5d5-bbb569dab654" containerID="a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1" exitCode=0 Apr 22 18:47:28.312795 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:28.312738 2570 generic.go:358] 
"Generic (PLEG): container finished" podID="88da529e-809b-42f5-a5d5-bbb569dab654" containerID="5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b" exitCode=0 Apr 22 18:47:28.312795 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:28.312744 2570 generic.go:358] "Generic (PLEG): container finished" podID="88da529e-809b-42f5-a5d5-bbb569dab654" containerID="28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb" exitCode=0 Apr 22 18:47:28.312795 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:28.312772 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerDied","Data":"6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a"} Apr 22 18:47:28.313071 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:28.312805 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerDied","Data":"a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1"} Apr 22 18:47:28.313071 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:28.312815 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerDied","Data":"5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b"} Apr 22 18:47:28.313071 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:28.312824 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerDied","Data":"28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb"} Apr 22 18:47:29.110101 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.110074 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.308352 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308250 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-config-out\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.308352 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308325 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-metric\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.308352 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308353 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-metrics-client-ca\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.308646 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308403 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-main-tls\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.308646 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308436 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-config-volume\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: 
\"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.308646 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308463 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2f2f\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-kube-api-access-z2f2f\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.308646 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308485 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-main-db\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.308646 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308515 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-web\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.308646 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308540 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-tls-assets\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.308946 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.308786 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). 
InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:29.309156 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.309096 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:47:29.309293 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.309171 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-cluster-tls-config\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.309293 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.309233 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-trusted-ca-bundle\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.309293 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.309273 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-web-config\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.309439 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.309315 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy\") pod \"88da529e-809b-42f5-a5d5-bbb569dab654\" (UID: \"88da529e-809b-42f5-a5d5-bbb569dab654\") " Apr 22 18:47:29.309928 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.309662 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-metrics-client-ca\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.309928 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.309686 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-main-db\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.309928 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.309891 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:29.312261 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.312226 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-config-out" (OuterVolumeSpecName: "config-out") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:47:29.312261 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.312235 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:29.312434 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.312281 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:29.312434 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.312320 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-kube-api-access-z2f2f" (OuterVolumeSpecName: "kube-api-access-z2f2f") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "kube-api-access-z2f2f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:29.312434 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.312392 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:29.312577 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.312533 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-config-volume" (OuterVolumeSpecName: "config-volume") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:29.312686 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.312669 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:29.312903 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.312856 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:29.318186 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.318101 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:29.320122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.319859 2570 generic.go:358] "Generic (PLEG): container finished" podID="88da529e-809b-42f5-a5d5-bbb569dab654" containerID="ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439" exitCode=0 Apr 22 18:47:29.320122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.319885 2570 generic.go:358] "Generic (PLEG): container finished" podID="88da529e-809b-42f5-a5d5-bbb569dab654" containerID="3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6" exitCode=0 Apr 22 18:47:29.320122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.319967 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerDied","Data":"ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439"} Apr 22 18:47:29.320122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.320006 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerDied","Data":"3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6"} Apr 22 18:47:29.320122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.320011 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.320122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.320024 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"88da529e-809b-42f5-a5d5-bbb569dab654","Type":"ContainerDied","Data":"19886abda5677155273e49fa760c7b27a0e23be1e57aefe5306e023a8df4f8a6"} Apr 22 18:47:29.320122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.320042 2570 scope.go:117] "RemoveContainer" containerID="6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a" Apr 22 18:47:29.326144 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.326116 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-web-config" (OuterVolumeSpecName: "web-config") pod "88da529e-809b-42f5-a5d5-bbb569dab654" (UID: "88da529e-809b-42f5-a5d5-bbb569dab654"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:29.328939 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.328923 2570 scope.go:117] "RemoveContainer" containerID="ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439" Apr 22 18:47:29.335826 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.335807 2570 scope.go:117] "RemoveContainer" containerID="a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1" Apr 22 18:47:29.342579 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.342561 2570 scope.go:117] "RemoveContainer" containerID="3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6" Apr 22 18:47:29.349123 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.349092 2570 scope.go:117] "RemoveContainer" containerID="5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b" Apr 22 18:47:29.355751 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.355730 2570 scope.go:117] "RemoveContainer" 
containerID="28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb" Apr 22 18:47:29.362253 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.362237 2570 scope.go:117] "RemoveContainer" containerID="7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048" Apr 22 18:47:29.368858 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.368842 2570 scope.go:117] "RemoveContainer" containerID="6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a" Apr 22 18:47:29.369102 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:47:29.369085 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a\": container with ID starting with 6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a not found: ID does not exist" containerID="6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a" Apr 22 18:47:29.369159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.369110 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a"} err="failed to get container status \"6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a\": rpc error: code = NotFound desc = could not find container \"6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a\": container with ID starting with 6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a not found: ID does not exist" Apr 22 18:47:29.369159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.369139 2570 scope.go:117] "RemoveContainer" containerID="ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439" Apr 22 18:47:29.369329 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:47:29.369310 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439\": container with ID starting with ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439 not found: ID does not exist" containerID="ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439" Apr 22 18:47:29.369407 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.369331 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439"} err="failed to get container status \"ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439\": rpc error: code = NotFound desc = could not find container \"ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439\": container with ID starting with ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439 not found: ID does not exist" Apr 22 18:47:29.369407 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.369343 2570 scope.go:117] "RemoveContainer" containerID="a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1" Apr 22 18:47:29.369556 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:47:29.369536 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1\": container with ID starting with a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1 not found: ID does not exist" containerID="a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1" Apr 22 18:47:29.369593 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.369560 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1"} err="failed to get container status \"a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1\": rpc error: code = NotFound desc = could not find container 
\"a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1\": container with ID starting with a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1 not found: ID does not exist" Apr 22 18:47:29.369593 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.369575 2570 scope.go:117] "RemoveContainer" containerID="3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6" Apr 22 18:47:29.369905 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:47:29.369884 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6\": container with ID starting with 3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6 not found: ID does not exist" containerID="3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6" Apr 22 18:47:29.369966 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.369913 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6"} err="failed to get container status \"3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6\": rpc error: code = NotFound desc = could not find container \"3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6\": container with ID starting with 3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6 not found: ID does not exist" Apr 22 18:47:29.369966 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.369928 2570 scope.go:117] "RemoveContainer" containerID="5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b" Apr 22 18:47:29.370173 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:47:29.370154 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b\": container with ID starting with 
5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b not found: ID does not exist" containerID="5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b" Apr 22 18:47:29.370210 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.370179 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b"} err="failed to get container status \"5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b\": rpc error: code = NotFound desc = could not find container \"5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b\": container with ID starting with 5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b not found: ID does not exist" Apr 22 18:47:29.370210 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.370195 2570 scope.go:117] "RemoveContainer" containerID="28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb" Apr 22 18:47:29.370414 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:47:29.370398 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb\": container with ID starting with 28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb not found: ID does not exist" containerID="28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb" Apr 22 18:47:29.370455 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.370417 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb"} err="failed to get container status \"28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb\": rpc error: code = NotFound desc = could not find container \"28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb\": container with ID starting with 
28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb not found: ID does not exist" Apr 22 18:47:29.370455 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.370430 2570 scope.go:117] "RemoveContainer" containerID="7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048" Apr 22 18:47:29.370660 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:47:29.370643 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048\": container with ID starting with 7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048 not found: ID does not exist" containerID="7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048" Apr 22 18:47:29.370701 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.370666 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048"} err="failed to get container status \"7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048\": rpc error: code = NotFound desc = could not find container \"7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048\": container with ID starting with 7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048 not found: ID does not exist" Apr 22 18:47:29.370701 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.370681 2570 scope.go:117] "RemoveContainer" containerID="6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a" Apr 22 18:47:29.370910 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.370890 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a"} err="failed to get container status \"6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a\": rpc error: code = NotFound desc = could not find 
container \"6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a\": container with ID starting with 6f8f287c28fe2fe37fc2bfc8bf5061d4abf99f75bee3c0c003b8fac58d974e6a not found: ID does not exist" Apr 22 18:47:29.370957 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.370910 2570 scope.go:117] "RemoveContainer" containerID="ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439" Apr 22 18:47:29.371113 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371097 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439"} err="failed to get container status \"ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439\": rpc error: code = NotFound desc = could not find container \"ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439\": container with ID starting with ad9fbd9de5f03c8d8d72f0cd278ea4530640a20133125eb836aa727b351dd439 not found: ID does not exist" Apr 22 18:47:29.371156 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371114 2570 scope.go:117] "RemoveContainer" containerID="a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1" Apr 22 18:47:29.371299 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371281 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1"} err="failed to get container status \"a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1\": rpc error: code = NotFound desc = could not find container \"a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1\": container with ID starting with a385d1d12b4245a93ac086cd86e7235c32ece13c6c871db0babcdec5da72c4c1 not found: ID does not exist" Apr 22 18:47:29.371340 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371301 2570 scope.go:117] "RemoveContainer" 
containerID="3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6" Apr 22 18:47:29.371498 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371483 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6"} err="failed to get container status \"3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6\": rpc error: code = NotFound desc = could not find container \"3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6\": container with ID starting with 3705f6c95a0793e3922b310b446a43ee27a3f7f714c1d5e0fbd1f7b36e8b64a6 not found: ID does not exist" Apr 22 18:47:29.371498 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371497 2570 scope.go:117] "RemoveContainer" containerID="5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b" Apr 22 18:47:29.371729 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371710 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b"} err="failed to get container status \"5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b\": rpc error: code = NotFound desc = could not find container \"5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b\": container with ID starting with 5eca2b8009486d44e114598cc1d227c0038b4d1194086c44bdf2cf6e36eeb11b not found: ID does not exist" Apr 22 18:47:29.371729 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371729 2570 scope.go:117] "RemoveContainer" containerID="28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb" Apr 22 18:47:29.371966 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371947 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb"} err="failed to get container status 
\"28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb\": rpc error: code = NotFound desc = could not find container \"28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb\": container with ID starting with 28760800c5bed6c235aaf2aeb1721e4878f244e122163140138a4d3f653633bb not found: ID does not exist" Apr 22 18:47:29.372020 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.371966 2570 scope.go:117] "RemoveContainer" containerID="7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048" Apr 22 18:47:29.372175 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.372158 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048"} err="failed to get container status \"7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048\": rpc error: code = NotFound desc = could not find container \"7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048\": container with ID starting with 7a2e446b17c5a9cba1dde65fcaf113b037957641331117a4ec52e33ab1f63048 not found: ID does not exist" Apr 22 18:47:29.410404 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410378 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-config-volume\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410404 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410407 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z2f2f\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-kube-api-access-z2f2f\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410419 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410431 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/88da529e-809b-42f5-a5d5-bbb569dab654-tls-assets\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410440 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-cluster-tls-config\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410449 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88da529e-809b-42f5-a5d5-bbb569dab654-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410459 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-web-config\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410468 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410476 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/88da529e-809b-42f5-a5d5-bbb569dab654-config-out\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410485 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.410548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.410494 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/88da529e-809b-42f5-a5d5-bbb569dab654-secret-alertmanager-main-tls\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:29.639493 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.639460 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:47:29.643241 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.643212 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:47:29.671952 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.671916 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:47:29.672243 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672230 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy-metric" Apr 22 18:47:29.672305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672246 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy-metric" Apr 22 18:47:29.672305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672255 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="init-config-reloader" Apr 22 18:47:29.672305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672261 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="init-config-reloader" Apr 22 18:47:29.672305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672271 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="config-reloader" Apr 22 18:47:29.672305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672278 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="config-reloader" Apr 22 18:47:29.672305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672286 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="alertmanager" Apr 22 18:47:29.672305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672291 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="alertmanager" Apr 22 18:47:29.672305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672301 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="prom-label-proxy" Apr 22 18:47:29.672305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672306 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="prom-label-proxy" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672314 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy-web" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672319 2570 
state_mem.go:107] "Deleted CPUSet assignment" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy-web" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672327 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672332 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672380 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy-web" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672390 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672397 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="prom-label-proxy" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672404 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="kube-rbac-proxy-metric" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672411 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="alertmanager" Apr 22 18:47:29.672590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.672417 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" containerName="config-reloader" Apr 22 18:47:29.679398 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.679367 
2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.682873 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.682845 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:47:29.682970 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.682845 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:47:29.683037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.682971 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:47:29.683114 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.683100 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:47:29.683162 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.683151 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:47:29.683419 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.683394 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:47:29.683532 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.683427 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:47:29.683532 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.683399 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-cjmc2\"" Apr 22 18:47:29.683919 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.683902 2570 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:47:29.690282 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.690261 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:47:29.690369 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.690318 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:47:29.713085 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713055 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1640a33-1eb6-4132-b617-fe75c229730f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713245 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713099 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-web-config\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713245 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713121 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1640a33-1eb6-4132-b617-fe75c229730f-config-out\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713245 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713167 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wznf\" (UniqueName: 
\"kubernetes.io/projected/b1640a33-1eb6-4132-b617-fe75c229730f-kube-api-access-9wznf\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713245 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713195 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713245 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713219 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713245 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713238 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b1640a33-1eb6-4132-b617-fe75c229730f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-config-volume\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
18:47:29.713431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713309 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713327 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1640a33-1eb6-4132-b617-fe75c229730f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713388 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1640a33-1eb6-4132-b617-fe75c229730f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713415 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.713579 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.713440 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.814411 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.814376 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1640a33-1eb6-4132-b617-fe75c229730f-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.814411 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.814424 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.814734 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.814461 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.814734 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.814508 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1640a33-1eb6-4132-b617-fe75c229730f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.814844 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.814737 
2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-web-config\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.814844 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.814773 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1640a33-1eb6-4132-b617-fe75c229730f-config-out\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.814951 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.814890 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wznf\" (UniqueName: \"kubernetes.io/projected/b1640a33-1eb6-4132-b617-fe75c229730f-kube-api-access-9wznf\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.815001 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.814951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.815054 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.815001 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
22 18:47:29.815054 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.815030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b1640a33-1eb6-4132-b617-fe75c229730f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.815158 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.815113 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-config-volume\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.815158 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.815138 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.815262 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.815180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1640a33-1eb6-4132-b617-fe75c229730f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.815389 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.815364 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1640a33-1eb6-4132-b617-fe75c229730f-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.816250 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.816220 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1640a33-1eb6-4132-b617-fe75c229730f-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.816982 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.816931 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b1640a33-1eb6-4132-b617-fe75c229730f-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.817737 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.817713 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-web-config\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.817891 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.817870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.817980 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.817900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1640a33-1eb6-4132-b617-fe75c229730f-tls-assets\") pod \"alertmanager-main-0\" (UID: 
\"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.818074 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.818056 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1640a33-1eb6-4132-b617-fe75c229730f-config-out\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.818222 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.818203 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.818434 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.818416 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.818653 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.818638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-config-volume\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.819380 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.819365 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-cluster-tls-config\") pod 
\"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.819887 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.819869 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b1640a33-1eb6-4132-b617-fe75c229730f-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.824344 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.824322 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wznf\" (UniqueName: \"kubernetes.io/projected/b1640a33-1eb6-4132-b617-fe75c229730f-kube-api-access-9wznf\") pod \"alertmanager-main-0\" (UID: \"b1640a33-1eb6-4132-b617-fe75c229730f\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:29.989688 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:29.989655 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:47:30.126111 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:30.126076 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:47:30.129086 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:47:30.129057 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1640a33_1eb6_4132_b617_fe75c229730f.slice/crio-0972e56a7f307341af6f266d7c93ca3677608df2c100415fbd52dbd38a4889bb WatchSource:0}: Error finding container 0972e56a7f307341af6f266d7c93ca3677608df2c100415fbd52dbd38a4889bb: Status 404 returned error can't find the container with id 0972e56a7f307341af6f266d7c93ca3677608df2c100415fbd52dbd38a4889bb Apr 22 18:47:30.324133 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:30.324093 2570 generic.go:358] "Generic (PLEG): container finished" podID="b1640a33-1eb6-4132-b617-fe75c229730f" containerID="cf31e1b8a9fc2fd30bda9b4fee75b6caa3049be32f76e120139c3555f3a5a0d4" exitCode=0 Apr 22 18:47:30.324300 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:30.324183 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1640a33-1eb6-4132-b617-fe75c229730f","Type":"ContainerDied","Data":"cf31e1b8a9fc2fd30bda9b4fee75b6caa3049be32f76e120139c3555f3a5a0d4"} Apr 22 18:47:30.324300 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:30.324220 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1640a33-1eb6-4132-b617-fe75c229730f","Type":"ContainerStarted","Data":"0972e56a7f307341af6f266d7c93ca3677608df2c100415fbd52dbd38a4889bb"} Apr 22 18:47:31.330398 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:31.330359 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b1640a33-1eb6-4132-b617-fe75c229730f","Type":"ContainerStarted","Data":"884f01e74c3fd0451cb84b59c6a35bc543c856d6851783f9d797e7777c6457e5"} Apr 22 18:47:31.330766 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:31.330404 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1640a33-1eb6-4132-b617-fe75c229730f","Type":"ContainerStarted","Data":"2ed422fd6137187bf41bb6514bd6833a8f50ebd6cbb981aa04e43b1a161abf7a"} Apr 22 18:47:31.330766 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:31.330421 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1640a33-1eb6-4132-b617-fe75c229730f","Type":"ContainerStarted","Data":"753b4a62752f0afdb66fd0f9ec6a61fa6f6ac23ee0121e3065687fd358037c1a"} Apr 22 18:47:31.330766 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:31.330434 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1640a33-1eb6-4132-b617-fe75c229730f","Type":"ContainerStarted","Data":"a1e0dd5dd9fe8876efdd334ede907b804aa509f07763edd36499a167c40f8a11"} Apr 22 18:47:31.330766 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:31.330447 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1640a33-1eb6-4132-b617-fe75c229730f","Type":"ContainerStarted","Data":"d97e4024778c02c493966d667fcd623bc867019e13bd3879f960baa9d06ace64"} Apr 22 18:47:31.330766 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:31.330459 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1640a33-1eb6-4132-b617-fe75c229730f","Type":"ContainerStarted","Data":"3528ff1bcd9031a7e3b965a5f94410112922245a460ab0f6636e4ebdd75db09e"} Apr 22 18:47:31.364116 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:31.364037 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.36401902 podStartE2EDuration="2.36401902s" podCreationTimestamp="2026-04-22 18:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:47:31.356857344 +0000 UTC m=+242.340323310" watchObservedRunningTime="2026-04-22 18:47:31.36401902 +0000 UTC m=+242.347484963" Apr 22 18:47:31.550232 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:31.550197 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88da529e-809b-42f5-a5d5-bbb569dab654" path="/var/lib/kubelet/pods/88da529e-809b-42f5-a5d5-bbb569dab654/volumes" Apr 22 18:47:32.122525 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.122494 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:47:32.122941 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.122914 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="prometheus" containerID="cri-o://3ecd98642e846447ea536a9b1be2f9860973fe3eec43b8309f438a5ced928e78" gracePeriod=600 Apr 22 18:47:32.123049 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.122934 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy" containerID="cri-o://e9be094c06b5687a611ebf0ec74075f879aeef475771e561576b0e782cc7ba02" gracePeriod=600 Apr 22 18:47:32.123049 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.122969 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy-web" 
containerID="cri-o://30d9c17d9a434865a754493bf87789b75ba742ba0e610e9fbb467008fa9e41a2" gracePeriod=600 Apr 22 18:47:32.123049 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.122950 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="config-reloader" containerID="cri-o://3b783696f9600b29c9d61f9522c9ee0bdcbd2f9be1dae914664f75af1ed073cb" gracePeriod=600 Apr 22 18:47:32.123049 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.122998 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy-thanos" containerID="cri-o://b985b0d57fb29542afb079318677e517dfd3f222e65a503cd2bb14b67c90776b" gracePeriod=600 Apr 22 18:47:32.123049 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.122947 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="thanos-sidecar" containerID="cri-o://cc5d41e47bf7ef26d5dffbae4ff98a2fb139849a5c26b2e3e9d0085b6ebf53f1" gracePeriod=600 Apr 22 18:47:32.338275 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338238 2570 generic.go:358] "Generic (PLEG): container finished" podID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerID="b985b0d57fb29542afb079318677e517dfd3f222e65a503cd2bb14b67c90776b" exitCode=0 Apr 22 18:47:32.338275 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338271 2570 generic.go:358] "Generic (PLEG): container finished" podID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerID="e9be094c06b5687a611ebf0ec74075f879aeef475771e561576b0e782cc7ba02" exitCode=0 Apr 22 18:47:32.338275 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338281 2570 generic.go:358] "Generic (PLEG): container finished" podID="fb8ac54e-a082-459a-a5ea-76e30144ed07" 
containerID="30d9c17d9a434865a754493bf87789b75ba742ba0e610e9fbb467008fa9e41a2" exitCode=0 Apr 22 18:47:32.338837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338290 2570 generic.go:358] "Generic (PLEG): container finished" podID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerID="cc5d41e47bf7ef26d5dffbae4ff98a2fb139849a5c26b2e3e9d0085b6ebf53f1" exitCode=0 Apr 22 18:47:32.338837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338299 2570 generic.go:358] "Generic (PLEG): container finished" podID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerID="3b783696f9600b29c9d61f9522c9ee0bdcbd2f9be1dae914664f75af1ed073cb" exitCode=0 Apr 22 18:47:32.338837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338306 2570 generic.go:358] "Generic (PLEG): container finished" podID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerID="3ecd98642e846447ea536a9b1be2f9860973fe3eec43b8309f438a5ced928e78" exitCode=0 Apr 22 18:47:32.338837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338308 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerDied","Data":"b985b0d57fb29542afb079318677e517dfd3f222e65a503cd2bb14b67c90776b"} Apr 22 18:47:32.338837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338346 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerDied","Data":"e9be094c06b5687a611ebf0ec74075f879aeef475771e561576b0e782cc7ba02"} Apr 22 18:47:32.338837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338357 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerDied","Data":"30d9c17d9a434865a754493bf87789b75ba742ba0e610e9fbb467008fa9e41a2"} Apr 22 18:47:32.338837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338366 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerDied","Data":"cc5d41e47bf7ef26d5dffbae4ff98a2fb139849a5c26b2e3e9d0085b6ebf53f1"} Apr 22 18:47:32.338837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338374 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerDied","Data":"3b783696f9600b29c9d61f9522c9ee0bdcbd2f9be1dae914664f75af1ed073cb"} Apr 22 18:47:32.338837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.338384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerDied","Data":"3ecd98642e846447ea536a9b1be2f9860973fe3eec43b8309f438a5ced928e78"} Apr 22 18:47:32.366417 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.366394 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:32.435942 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.435848 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzgf8\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-kube-api-access-zzgf8\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.435942 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.435893 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-serving-certs-ca-bundle\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.435942 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.435914 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-metrics-client-ca\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.435942 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.435940 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-db\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.436244 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.435975 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.436244 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.435994 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-grpc-tls\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.436244 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.436033 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-tls-assets\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.436244 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.436091 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-kubelet-serving-ca-bundle\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.436244 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.436115 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-thanos-prometheus-http-client-file\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.436556 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod 
"fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.436565 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.436648 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-trusted-ca-bundle\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.436687 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-metrics-client-certs\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.436948 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.436782 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-config\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.437016 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-tls\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.437044 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-rulefiles-0\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.437070 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.437113 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-web-config\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 
18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.437142 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-config-out\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.437175 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-kube-rbac-proxy\") pod \"fb8ac54e-a082-459a-a5ea-76e30144ed07\" (UID: \"fb8ac54e-a082-459a-a5ea-76e30144ed07\") " Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.437335 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.437977 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.438000 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-metrics-client-ca\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.438017 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.438732 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.438035 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.439881 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.438126 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:47:32.440161 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.440063 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:32.440575 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.440545 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:32.440686 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.440583 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:32.440990 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.440964 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-kube-api-access-zzgf8" (OuterVolumeSpecName: "kube-api-access-zzgf8") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "kube-api-access-zzgf8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:32.441059 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.440969 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:32.441274 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.441238 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:32.441548 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.441510 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:32.441762 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.441729 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). 
InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:32.442125 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.442097 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-config-out" (OuterVolumeSpecName: "config-out") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:47:32.442256 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.442240 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:32.442964 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.442936 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-config" (OuterVolumeSpecName: "config") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:32.443268 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.443241 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:32.453704 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.453674 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-web-config" (OuterVolumeSpecName: "web-config") pod "fb8ac54e-a082-459a-a5ea-76e30144ed07" (UID: "fb8ac54e-a082-459a-a5ea-76e30144ed07"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:32.539067 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539025 2570 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-metrics-client-certs\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539067 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539060 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-config\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539067 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539074 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539090 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539104 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539117 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-web-config\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539129 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-config-out\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539141 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-kube-rbac-proxy\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539155 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zzgf8\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-kube-api-access-zzgf8\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539168 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fb8ac54e-a082-459a-a5ea-76e30144ed07-prometheus-k8s-db\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539180 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539193 2570 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-secret-grpc-tls\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539206 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb8ac54e-a082-459a-a5ea-76e30144ed07-tls-assets\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:32.539334 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:32.539220 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb8ac54e-a082-459a-a5ea-76e30144ed07-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:47:33.344446 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.344407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb8ac54e-a082-459a-a5ea-76e30144ed07","Type":"ContainerDied","Data":"1b58437918ed8bda735686bd807e9589e0bdb25de2e9a73750deb5fd2f1586e0"} Apr 22 18:47:33.344446 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.344455 2570 scope.go:117] "RemoveContainer" containerID="b985b0d57fb29542afb079318677e517dfd3f222e65a503cd2bb14b67c90776b" Apr 22 18:47:33.344959 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.344453 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.352713 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.352487 2570 scope.go:117] "RemoveContainer" containerID="e9be094c06b5687a611ebf0ec74075f879aeef475771e561576b0e782cc7ba02" Apr 22 18:47:33.360066 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.360048 2570 scope.go:117] "RemoveContainer" containerID="30d9c17d9a434865a754493bf87789b75ba742ba0e610e9fbb467008fa9e41a2" Apr 22 18:47:33.366457 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.366435 2570 scope.go:117] "RemoveContainer" containerID="cc5d41e47bf7ef26d5dffbae4ff98a2fb139849a5c26b2e3e9d0085b6ebf53f1" Apr 22 18:47:33.369503 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.369478 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:47:33.373434 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.373409 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:47:33.373761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.373742 2570 scope.go:117] "RemoveContainer" containerID="3b783696f9600b29c9d61f9522c9ee0bdcbd2f9be1dae914664f75af1ed073cb" Apr 22 18:47:33.380182 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.380166 2570 scope.go:117] "RemoveContainer" containerID="3ecd98642e846447ea536a9b1be2f9860973fe3eec43b8309f438a5ced928e78" Apr 22 18:47:33.388898 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.388762 2570 scope.go:117] "RemoveContainer" containerID="7436b19beaf7703a1ab5990581e009f354e9033b96f0791a8be329c4560a5e20" Apr 22 18:47:33.404768 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.404739 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:47:33.405071 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405059 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="init-config-reloader" Apr 22 18:47:33.405120 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405076 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="init-config-reloader" Apr 22 18:47:33.405120 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405094 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy-thanos" Apr 22 18:47:33.405120 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405100 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy-thanos" Apr 22 18:47:33.405120 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405110 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy" Apr 22 18:47:33.405120 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405115 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405122 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="thanos-sidecar" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405127 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="thanos-sidecar" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405133 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy-web" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405138 2570 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy-web" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405146 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="config-reloader" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405151 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="config-reloader" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405171 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="prometheus" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405176 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="prometheus" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405225 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="config-reloader" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405236 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy-web" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405245 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy-thanos" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405254 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="thanos-sidecar" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405260 
2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="prometheus" Apr 22 18:47:33.405263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.405267 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" containerName="kube-rbac-proxy" Apr 22 18:47:33.410860 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.410844 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.413572 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.413551 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:47:33.413726 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.413554 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:47:33.413726 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.413554 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-785mk\"" Apr 22 18:47:33.413938 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.413923 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:47:33.414036 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.413948 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:47:33.414187 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.414171 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-12g2s2bc200p7\"" Apr 22 18:47:33.414533 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.414497 2570 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:47:33.414607 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.414575 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:47:33.414607 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.414572 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:47:33.414790 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.414771 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:47:33.415388 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.415369 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:47:33.415458 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.415369 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:47:33.417682 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.417611 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:47:33.421546 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.421517 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:47:33.427420 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.424257 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:47:33.445993 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.445958 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6be795e0-0dd9-48ab-8279-902c6314c44d-config-out\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446094 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446002 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-config\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446094 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446023 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6be795e0-0dd9-48ab-8279-902c6314c44d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446094 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446078 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446102 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446191 ip-10-0-134-244 kubenswrapper[2570]: I0422 
18:47:33.446118 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446144 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446170 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446187 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446337 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446206 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446337 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446221 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446337 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446235 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5rhx\" (UniqueName: \"kubernetes.io/projected/6be795e0-0dd9-48ab-8279-902c6314c44d-kube-api-access-c5rhx\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446337 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446261 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446337 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446308 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-web-config\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446337 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:47:33.446329 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6be795e0-0dd9-48ab-8279-902c6314c44d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446499 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446393 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446499 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446430 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.446499 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.446455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547249 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547214 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547249 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6be795e0-0dd9-48ab-8279-902c6314c44d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547491 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547398 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547491 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547443 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547491 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547470 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547678 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547505 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/6be795e0-0dd9-48ab-8279-902c6314c44d-config-out\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547678 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547535 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-config\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547678 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547563 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6be795e0-0dd9-48ab-8279-902c6314c44d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547678 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547651 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547878 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547680 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547878 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547704 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547878 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547746 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547878 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547878 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547878 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547847 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.547878 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:47:33.547872 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.548268 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547894 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5rhx\" (UniqueName: \"kubernetes.io/projected/6be795e0-0dd9-48ab-8279-902c6314c44d-kube-api-access-c5rhx\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.548268 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.547934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.549895 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.549419 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.551350 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.551319 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6be795e0-0dd9-48ab-8279-902c6314c44d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.551461 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.551327 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8ac54e-a082-459a-a5ea-76e30144ed07" path="/var/lib/kubelet/pods/fb8ac54e-a082-459a-a5ea-76e30144ed07/volumes" Apr 22 18:47:33.551461 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.551450 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6be795e0-0dd9-48ab-8279-902c6314c44d-config-out\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.551873 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.551851 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.552002 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.551878 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.552096 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.551963 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 22 18:47:33.552173 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.552024 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.552355 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.552311 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.552468 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.552443 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-web-config\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.552657 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.552559 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6be795e0-0dd9-48ab-8279-902c6314c44d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.553005 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.552978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.553089 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.553036 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.553860 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.553834 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.554377 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.554358 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-config\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.554904 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.554882 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6be795e0-0dd9-48ab-8279-902c6314c44d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.555078 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.555057 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.555349 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.555329 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6be795e0-0dd9-48ab-8279-902c6314c44d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.559772 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.559756 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5rhx\" (UniqueName: \"kubernetes.io/projected/6be795e0-0dd9-48ab-8279-902c6314c44d-kube-api-access-c5rhx\") pod \"prometheus-k8s-0\" (UID: \"6be795e0-0dd9-48ab-8279-902c6314c44d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.721504 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.721401 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:33.857510 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:33.857477 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:47:33.860470 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:47:33.860442 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6be795e0_0dd9_48ab_8279_902c6314c44d.slice/crio-56ca009d9c39bc885fbb7579b76f31db9e975ccf75e7ddfe4767e78933a1373b WatchSource:0}: Error finding container 56ca009d9c39bc885fbb7579b76f31db9e975ccf75e7ddfe4767e78933a1373b: Status 404 returned error can't find the container with id 56ca009d9c39bc885fbb7579b76f31db9e975ccf75e7ddfe4767e78933a1373b Apr 22 18:47:34.349371 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:34.349337 2570 generic.go:358] "Generic (PLEG): container finished" podID="6be795e0-0dd9-48ab-8279-902c6314c44d" containerID="ccba4391cf91a0bf4b0a7e3094bca845a1614fef420d9a8a807768b320c265ea" exitCode=0 Apr 22 18:47:34.349957 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:34.349431 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6be795e0-0dd9-48ab-8279-902c6314c44d","Type":"ContainerDied","Data":"ccba4391cf91a0bf4b0a7e3094bca845a1614fef420d9a8a807768b320c265ea"} Apr 22 18:47:34.349957 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:34.349476 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6be795e0-0dd9-48ab-8279-902c6314c44d","Type":"ContainerStarted","Data":"56ca009d9c39bc885fbb7579b76f31db9e975ccf75e7ddfe4767e78933a1373b"} Apr 22 18:47:35.356713 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:35.356677 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6be795e0-0dd9-48ab-8279-902c6314c44d","Type":"ContainerStarted","Data":"fd3498ae06d320d00a519bca7c717a732d13d55ef6b1e9c85669785ee2da00f1"} Apr 22 18:47:35.357087 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:35.356721 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6be795e0-0dd9-48ab-8279-902c6314c44d","Type":"ContainerStarted","Data":"3f9ed08f06746edff40a5ad9cdd9cb7128a89ec2f3fb2e6f26bbff275586373d"} Apr 22 18:47:35.357087 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:35.356736 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6be795e0-0dd9-48ab-8279-902c6314c44d","Type":"ContainerStarted","Data":"9b82de67940ece3c9499b707d8a3dc36031894b4669ebaefac6be70e7630614a"} Apr 22 18:47:35.357087 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:35.356748 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6be795e0-0dd9-48ab-8279-902c6314c44d","Type":"ContainerStarted","Data":"1304d12860e64083af4f16be3312b7623fdfdea70ef148ba7c17e60dbffe357b"} Apr 22 18:47:35.357087 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:35.356761 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6be795e0-0dd9-48ab-8279-902c6314c44d","Type":"ContainerStarted","Data":"75e7673088993677fd483b72340bbe8ae64c3c6a6331364e84e741a05517d22f"} Apr 22 18:47:35.357087 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:35.356773 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6be795e0-0dd9-48ab-8279-902c6314c44d","Type":"ContainerStarted","Data":"26a557ec85d10b1ea28e69515320bd5a392eabf6d05e88b4dc12d0fee9c2e8cd"} Apr 22 18:47:35.413214 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:35.413159 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.413142427 podStartE2EDuration="2.413142427s" podCreationTimestamp="2026-04-22 18:47:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:47:35.412484702 +0000 UTC m=+246.395950646" watchObservedRunningTime="2026-04-22 18:47:35.413142427 +0000 UTC m=+246.396608368" Apr 22 18:47:38.722279 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:38.722227 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:47:41.322531 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:41.322485 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:47:41.325110 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:41.325083 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282-metrics-certs\") pod \"network-metrics-daemon-w8q5c\" (UID: \"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282\") " pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:47:41.449433 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:41.449401 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mgrtw\"" Apr 22 18:47:41.457599 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:41.457565 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-w8q5c" Apr 22 18:47:41.582018 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:41.581990 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-w8q5c"] Apr 22 18:47:41.584640 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:47:41.584595 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod317bb3bc_9a8f_408c_8cc6_ab0ccf5a0282.slice/crio-6cf567ba5925666f9dfdab100425f92edcace491a64eaac160d95f099879e5fa WatchSource:0}: Error finding container 6cf567ba5925666f9dfdab100425f92edcace491a64eaac160d95f099879e5fa: Status 404 returned error can't find the container with id 6cf567ba5925666f9dfdab100425f92edcace491a64eaac160d95f099879e5fa Apr 22 18:47:42.380097 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:42.380040 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w8q5c" event={"ID":"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282","Type":"ContainerStarted","Data":"6cf567ba5925666f9dfdab100425f92edcace491a64eaac160d95f099879e5fa"} Apr 22 18:47:43.385241 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:43.385202 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w8q5c" event={"ID":"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282","Type":"ContainerStarted","Data":"deb392aea20f9cb2a82d10aea7631da91f160233a9c24c47b15ba6017bc2d210"} Apr 22 18:47:43.385241 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:43.385239 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-w8q5c" event={"ID":"317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282","Type":"ContainerStarted","Data":"89967cadc3272bef3691bc164f418bcc062695cfbe487af91bc5f64913a59eb1"} Apr 22 18:47:43.405477 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:47:43.405423 2570 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-w8q5c" podStartSLOduration=253.430465306 podStartE2EDuration="4m14.405407764s" podCreationTimestamp="2026-04-22 18:43:29 +0000 UTC" firstStartedPulling="2026-04-22 18:47:41.586548536 +0000 UTC m=+252.570014462" lastFinishedPulling="2026-04-22 18:47:42.561490987 +0000 UTC m=+253.544956920" observedRunningTime="2026-04-22 18:47:43.404250481 +0000 UTC m=+254.387716435" watchObservedRunningTime="2026-04-22 18:47:43.405407764 +0000 UTC m=+254.388873765" Apr 22 18:48:29.402939 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:48:29.402900 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/0.log" Apr 22 18:48:29.403476 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:48:29.402944 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/0.log" Apr 22 18:48:29.410043 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:48:29.410025 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:48:33.722263 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:48:33.722226 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:33.738128 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:48:33.738101 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:48:34.553722 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:48:34.553691 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:19.600470 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.600382 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52"] Apr 22 18:49:19.602817 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:49:19.602793 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.605298 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.605266 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-7nxpj\"" Apr 22 18:49:19.605899 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.605886 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 18:49:19.606012 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.605996 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 18:49:19.606189 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.606167 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 18:49:19.607002 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.606987 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 18:49:19.621184 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.621162 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52"] Apr 22 18:49:19.662703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.662671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4l2s\" (UniqueName: \"kubernetes.io/projected/ddedcae0-3939-4ffe-8320-bdfb64ce1341-kube-api-access-f4l2s\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-gkc52\" (UID: \"ddedcae0-3939-4ffe-8320-bdfb64ce1341\") " 
pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.662703 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.662705 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddedcae0-3939-4ffe-8320-bdfb64ce1341-webhook-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-gkc52\" (UID: \"ddedcae0-3939-4ffe-8320-bdfb64ce1341\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.662890 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.662796 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddedcae0-3939-4ffe-8320-bdfb64ce1341-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-gkc52\" (UID: \"ddedcae0-3939-4ffe-8320-bdfb64ce1341\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.763230 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.763196 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4l2s\" (UniqueName: \"kubernetes.io/projected/ddedcae0-3939-4ffe-8320-bdfb64ce1341-kube-api-access-f4l2s\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-gkc52\" (UID: \"ddedcae0-3939-4ffe-8320-bdfb64ce1341\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.763230 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.763236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddedcae0-3939-4ffe-8320-bdfb64ce1341-webhook-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-gkc52\" (UID: \"ddedcae0-3939-4ffe-8320-bdfb64ce1341\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.763458 
ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.763282 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddedcae0-3939-4ffe-8320-bdfb64ce1341-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-gkc52\" (UID: \"ddedcae0-3939-4ffe-8320-bdfb64ce1341\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.765976 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.765947 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ddedcae0-3939-4ffe-8320-bdfb64ce1341-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-gkc52\" (UID: \"ddedcae0-3939-4ffe-8320-bdfb64ce1341\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.766053 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.765988 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ddedcae0-3939-4ffe-8320-bdfb64ce1341-webhook-cert\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-gkc52\" (UID: \"ddedcae0-3939-4ffe-8320-bdfb64ce1341\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.772204 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.772181 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4l2s\" (UniqueName: \"kubernetes.io/projected/ddedcae0-3939-4ffe-8320-bdfb64ce1341-kube-api-access-f4l2s\") pod \"opendatahub-operator-controller-manager-7c59bb5d7b-gkc52\" (UID: \"ddedcae0-3939-4ffe-8320-bdfb64ce1341\") " pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:19.914254 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:19.914157 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:20.046099 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:20.046032 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52"] Apr 22 18:49:20.048591 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:49:20.048550 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddedcae0_3939_4ffe_8320_bdfb64ce1341.slice/crio-4692a96ea71d3f586cab8403d6351e28b29323cc44075ef690e2b35cc03c5ba8 WatchSource:0}: Error finding container 4692a96ea71d3f586cab8403d6351e28b29323cc44075ef690e2b35cc03c5ba8: Status 404 returned error can't find the container with id 4692a96ea71d3f586cab8403d6351e28b29323cc44075ef690e2b35cc03c5ba8 Apr 22 18:49:20.050287 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:20.050267 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:49:20.679969 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:20.679924 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" event={"ID":"ddedcae0-3939-4ffe-8320-bdfb64ce1341","Type":"ContainerStarted","Data":"4692a96ea71d3f586cab8403d6351e28b29323cc44075ef690e2b35cc03c5ba8"} Apr 22 18:49:22.690108 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:22.690064 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" event={"ID":"ddedcae0-3939-4ffe-8320-bdfb64ce1341","Type":"ContainerStarted","Data":"d5a5ce5e15ed4e0d47b083468eae9d5818f6d33e179b726d01ec85b06b1b02ab"} Apr 22 18:49:22.690571 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:22.690190 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:22.713652 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:22.713567 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" podStartSLOduration=1.215714703 podStartE2EDuration="3.713550854s" podCreationTimestamp="2026-04-22 18:49:19 +0000 UTC" firstStartedPulling="2026-04-22 18:49:20.050395453 +0000 UTC m=+351.033861373" lastFinishedPulling="2026-04-22 18:49:22.548231606 +0000 UTC m=+353.531697524" observedRunningTime="2026-04-22 18:49:22.712677997 +0000 UTC m=+353.696143939" watchObservedRunningTime="2026-04-22 18:49:22.713550854 +0000 UTC m=+353.697016777" Apr 22 18:49:25.264658 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.264599 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s"] Apr 22 18:49:25.266979 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.266960 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.270782 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.270756 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:49:25.270920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.270756 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 18:49:25.270920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.270806 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:49:25.271023 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.270944 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 18:49:25.271158 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.271144 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 18:49:25.271239 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.271217 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-4w9pb\"" Apr 22 18:49:25.291577 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.291545 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s"] Apr 22 18:49:25.317638 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.317581 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-metrics-cert\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " 
pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.317811 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.317655 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jswhp\" (UniqueName: \"kubernetes.io/projected/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-kube-api-access-jswhp\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.317811 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.317687 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-cert\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.317811 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.317710 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-manager-config\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.418893 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.418842 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-metrics-cert\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.419103 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.418916 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jswhp\" (UniqueName: \"kubernetes.io/projected/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-kube-api-access-jswhp\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.419103 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.418951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-cert\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.419103 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.418984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-manager-config\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.419642 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.419567 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-manager-config\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.421567 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.421544 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-metrics-cert\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " 
pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.421682 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.421590 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-cert\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.428257 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.428232 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jswhp\" (UniqueName: \"kubernetes.io/projected/1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce-kube-api-access-jswhp\") pod \"lws-controller-manager-59544c8f7-fq47s\" (UID: \"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce\") " pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.576178 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.576065 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:25.706611 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:25.706543 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s"] Apr 22 18:49:25.709537 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:49:25.709508 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b5fc6cd_1f41_4ea0_b121_cc5f8c8846ce.slice/crio-9dd06ecdf8d3edd4f0318b1372ebee6ae78a8a6d946e6a83f496f1ad16acd0d4 WatchSource:0}: Error finding container 9dd06ecdf8d3edd4f0318b1372ebee6ae78a8a6d946e6a83f496f1ad16acd0d4: Status 404 returned error can't find the container with id 9dd06ecdf8d3edd4f0318b1372ebee6ae78a8a6d946e6a83f496f1ad16acd0d4 Apr 22 18:49:26.704335 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:26.704300 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" event={"ID":"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce","Type":"ContainerStarted","Data":"9dd06ecdf8d3edd4f0318b1372ebee6ae78a8a6d946e6a83f496f1ad16acd0d4"} Apr 22 18:49:29.716777 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:29.716740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" event={"ID":"1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce","Type":"ContainerStarted","Data":"439a61a26f7961a476816adc0b164b7216dafa43ea31b07d9ef60ee5ebb21087"} Apr 22 18:49:29.717190 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:29.716871 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:29.737289 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:29.737237 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" podStartSLOduration=1.25056515 podStartE2EDuration="4.737221542s" podCreationTimestamp="2026-04-22 18:49:25 +0000 UTC" firstStartedPulling="2026-04-22 18:49:25.711355257 +0000 UTC m=+356.694821176" lastFinishedPulling="2026-04-22 18:49:29.198011645 +0000 UTC m=+360.181477568" observedRunningTime="2026-04-22 18:49:29.735178163 +0000 UTC m=+360.718644103" watchObservedRunningTime="2026-04-22 18:49:29.737221542 +0000 UTC m=+360.720687482" Apr 22 18:49:33.695606 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:33.695574 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7c59bb5d7b-gkc52" Apr 22 18:49:37.247373 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.247327 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf"] Apr 22 18:49:37.249545 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.249523 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.252659 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.252612 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 22 18:49:37.252821 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.252707 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 22 18:49:37.252821 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.252763 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-jt6pc\"" Apr 22 18:49:37.263983 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.263957 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf"] Apr 22 18:49:37.329210 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.329174 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxcd\" (UniqueName: \"kubernetes.io/projected/0ddfad48-49ca-4fcd-9ed4-abfe5922044b-kube-api-access-zrxcd\") pod \"kube-auth-proxy-6bc9b7f4d-2vxvf\" (UID: \"0ddfad48-49ca-4fcd-9ed4-abfe5922044b\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.329210 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.329218 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ddfad48-49ca-4fcd-9ed4-abfe5922044b-tls-certs\") pod \"kube-auth-proxy-6bc9b7f4d-2vxvf\" (UID: \"0ddfad48-49ca-4fcd-9ed4-abfe5922044b\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.329443 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.329312 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/0ddfad48-49ca-4fcd-9ed4-abfe5922044b-tmp\") pod \"kube-auth-proxy-6bc9b7f4d-2vxvf\" (UID: \"0ddfad48-49ca-4fcd-9ed4-abfe5922044b\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.430726 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.430670 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0ddfad48-49ca-4fcd-9ed4-abfe5922044b-tmp\") pod \"kube-auth-proxy-6bc9b7f4d-2vxvf\" (UID: \"0ddfad48-49ca-4fcd-9ed4-abfe5922044b\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.430989 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.430774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxcd\" (UniqueName: \"kubernetes.io/projected/0ddfad48-49ca-4fcd-9ed4-abfe5922044b-kube-api-access-zrxcd\") pod \"kube-auth-proxy-6bc9b7f4d-2vxvf\" (UID: \"0ddfad48-49ca-4fcd-9ed4-abfe5922044b\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.430989 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.430802 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ddfad48-49ca-4fcd-9ed4-abfe5922044b-tls-certs\") pod \"kube-auth-proxy-6bc9b7f4d-2vxvf\" (UID: \"0ddfad48-49ca-4fcd-9ed4-abfe5922044b\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.433142 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.433118 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0ddfad48-49ca-4fcd-9ed4-abfe5922044b-tmp\") pod \"kube-auth-proxy-6bc9b7f4d-2vxvf\" (UID: \"0ddfad48-49ca-4fcd-9ed4-abfe5922044b\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.433450 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.433425 2570 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ddfad48-49ca-4fcd-9ed4-abfe5922044b-tls-certs\") pod \"kube-auth-proxy-6bc9b7f4d-2vxvf\" (UID: \"0ddfad48-49ca-4fcd-9ed4-abfe5922044b\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.440815 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.440787 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxcd\" (UniqueName: \"kubernetes.io/projected/0ddfad48-49ca-4fcd-9ed4-abfe5922044b-kube-api-access-zrxcd\") pod \"kube-auth-proxy-6bc9b7f4d-2vxvf\" (UID: \"0ddfad48-49ca-4fcd-9ed4-abfe5922044b\") " pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.559417 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.559336 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" Apr 22 18:49:37.702528 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.702475 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf"] Apr 22 18:49:37.705611 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:49:37.705582 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ddfad48_49ca_4fcd_9ed4_abfe5922044b.slice/crio-fe0375832329c018f8b176e40088650228a13c7a811777aaf5ca4a8856cf2b1f WatchSource:0}: Error finding container fe0375832329c018f8b176e40088650228a13c7a811777aaf5ca4a8856cf2b1f: Status 404 returned error can't find the container with id fe0375832329c018f8b176e40088650228a13c7a811777aaf5ca4a8856cf2b1f Apr 22 18:49:37.744452 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:37.744412 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" event={"ID":"0ddfad48-49ca-4fcd-9ed4-abfe5922044b","Type":"ContainerStarted","Data":"fe0375832329c018f8b176e40088650228a13c7a811777aaf5ca4a8856cf2b1f"} Apr 
22 18:49:40.723962 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:40.723921 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-59544c8f7-fq47s" Apr 22 18:49:41.765321 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:41.765289 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" event={"ID":"0ddfad48-49ca-4fcd-9ed4-abfe5922044b","Type":"ContainerStarted","Data":"5bdb147b7e7cc6178490d433b4c6a6ec14d3363821ee772d29bf4e8938f688fb"} Apr 22 18:49:41.785971 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:49:41.785908 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6bc9b7f4d-2vxvf" podStartSLOduration=1.323784237 podStartE2EDuration="4.785888435s" podCreationTimestamp="2026-04-22 18:49:37 +0000 UTC" firstStartedPulling="2026-04-22 18:49:37.707543081 +0000 UTC m=+368.691009002" lastFinishedPulling="2026-04-22 18:49:41.169647267 +0000 UTC m=+372.153113200" observedRunningTime="2026-04-22 18:49:41.784270289 +0000 UTC m=+372.767736230" watchObservedRunningTime="2026-04-22 18:49:41.785888435 +0000 UTC m=+372.769354378" Apr 22 18:51:35.684682 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.684648 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf"] Apr 22 18:51:35.687669 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.687653 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:35.690876 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.690858 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:51:35.692299 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.692279 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:51:35.692395 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.692322 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-8vdqp\"" Apr 22 18:51:35.701231 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.701209 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf"] Apr 22 18:51:35.753037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.753008 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65t68\" (UniqueName: \"kubernetes.io/projected/8114cacb-ce56-426b-b9ff-1f808f55a706-kube-api-access-65t68\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" (UID: \"8114cacb-ce56-426b-b9ff-1f808f55a706\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:35.753129 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.753042 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8114cacb-ce56-426b-b9ff-1f808f55a706-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" (UID: \"8114cacb-ce56-426b-b9ff-1f808f55a706\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:35.854515 
ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.854488 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65t68\" (UniqueName: \"kubernetes.io/projected/8114cacb-ce56-426b-b9ff-1f808f55a706-kube-api-access-65t68\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" (UID: \"8114cacb-ce56-426b-b9ff-1f808f55a706\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:35.854640 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.854521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8114cacb-ce56-426b-b9ff-1f808f55a706-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" (UID: \"8114cacb-ce56-426b-b9ff-1f808f55a706\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:35.854911 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.854892 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8114cacb-ce56-426b-b9ff-1f808f55a706-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" (UID: \"8114cacb-ce56-426b-b9ff-1f808f55a706\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:35.874675 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.874651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65t68\" (UniqueName: \"kubernetes.io/projected/8114cacb-ce56-426b-b9ff-1f808f55a706-kube-api-access-65t68\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" (UID: \"8114cacb-ce56-426b-b9ff-1f808f55a706\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:35.998040 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:35.997988 2570 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:36.141832 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:36.141685 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf"] Apr 22 18:51:36.144794 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:51:36.144767 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8114cacb_ce56_426b_b9ff_1f808f55a706.slice/crio-506748d8c6e4270358e765f99afaeb192308ff4719f5526848cbb4b73b0bea7f WatchSource:0}: Error finding container 506748d8c6e4270358e765f99afaeb192308ff4719f5526848cbb4b73b0bea7f: Status 404 returned error can't find the container with id 506748d8c6e4270358e765f99afaeb192308ff4719f5526848cbb4b73b0bea7f Apr 22 18:51:36.150440 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:36.150410 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" event={"ID":"8114cacb-ce56-426b-b9ff-1f808f55a706","Type":"ContainerStarted","Data":"506748d8c6e4270358e765f99afaeb192308ff4719f5526848cbb4b73b0bea7f"} Apr 22 18:51:43.175025 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:43.174985 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" event={"ID":"8114cacb-ce56-426b-b9ff-1f808f55a706","Type":"ContainerStarted","Data":"4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2"} Apr 22 18:51:43.175446 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:43.175232 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:54.181038 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:54.181008 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" Apr 22 18:51:54.211055 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:54.211002 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" podStartSLOduration=13.098848733 podStartE2EDuration="19.210983943s" podCreationTimestamp="2026-04-22 18:51:35 +0000 UTC" firstStartedPulling="2026-04-22 18:51:36.147262615 +0000 UTC m=+487.130728534" lastFinishedPulling="2026-04-22 18:51:42.259397817 +0000 UTC m=+493.242863744" observedRunningTime="2026-04-22 18:51:43.217584574 +0000 UTC m=+494.201050515" watchObservedRunningTime="2026-04-22 18:51:54.210983943 +0000 UTC m=+505.194449884" Apr 22 18:51:55.944730 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:55.944690 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf"] Apr 22 18:51:55.945119 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:55.944929 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" containerName="manager" containerID="cri-o://4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2" gracePeriod=2 Apr 22 18:51:55.957968 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:55.957940 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf"] Apr 22 18:51:55.992128 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:55.992099 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"] Apr 22 18:51:55.992466 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:55.992452 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" containerName="manager"
Apr 22 18:51:55.992513 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:55.992468 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" containerName="manager"
Apr 22 18:51:55.992547 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:55.992523 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" containerName="manager"
Apr 22 18:51:55.994465 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:55.994444 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:51:56.015569 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.015538 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"]
Apr 22 18:51:56.044914 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.044871 2570 status_manager.go:895] "Failed to get status for pod" podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 22 18:51:56.139982 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.139953 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t85pj\" (UniqueName: \"kubernetes.io/projected/b92e7ccb-be3d-464e-bd1e-683db2e34311-kube-api-access-t85pj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k6lzm\" (UID: \"b92e7ccb-be3d-464e-bd1e-683db2e34311\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:51:56.140138 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.140037 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b92e7ccb-be3d-464e-bd1e-683db2e34311-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k6lzm\" (UID: \"b92e7ccb-be3d-464e-bd1e-683db2e34311\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:51:56.173335 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.173311 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf"
Apr 22 18:51:56.176373 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.176347 2570 status_manager.go:895] "Failed to get status for pod" podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 22 18:51:56.223743 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.223656 2570 generic.go:358] "Generic (PLEG): container finished" podID="8114cacb-ce56-426b-b9ff-1f808f55a706" containerID="4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2" exitCode=0
Apr 22 18:51:56.223743 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.223708 2570 scope.go:117] "RemoveContainer" containerID="4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2"
Apr 22 18:51:56.223743 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.223714 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf"
Apr 22 18:51:56.226456 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.226427 2570 status_manager.go:895] "Failed to get status for pod" podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 22 18:51:56.231935 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.231917 2570 scope.go:117] "RemoveContainer" containerID="4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2"
Apr 22 18:51:56.232204 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:51:56.232188 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2\": container with ID starting with 4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2 not found: ID does not exist" containerID="4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2"
Apr 22 18:51:56.232243 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.232229 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2"} err="failed to get container status \"4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2\": rpc error: code = NotFound desc = could not find container \"4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2\": container with ID starting with 4555d540006ddb12fd1cff9f41eeea9d2b7825a911957f7fb5b6781940c231f2 not found: ID does not exist"
Apr 22 18:51:56.240537 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.240517 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65t68\" (UniqueName: \"kubernetes.io/projected/8114cacb-ce56-426b-b9ff-1f808f55a706-kube-api-access-65t68\") pod \"8114cacb-ce56-426b-b9ff-1f808f55a706\" (UID: \"8114cacb-ce56-426b-b9ff-1f808f55a706\") "
Apr 22 18:51:56.240609 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.240551 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8114cacb-ce56-426b-b9ff-1f808f55a706-extensions-socket-volume\") pod \"8114cacb-ce56-426b-b9ff-1f808f55a706\" (UID: \"8114cacb-ce56-426b-b9ff-1f808f55a706\") "
Apr 22 18:51:56.240775 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.240760 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b92e7ccb-be3d-464e-bd1e-683db2e34311-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k6lzm\" (UID: \"b92e7ccb-be3d-464e-bd1e-683db2e34311\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:51:56.240846 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.240808 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t85pj\" (UniqueName: \"kubernetes.io/projected/b92e7ccb-be3d-464e-bd1e-683db2e34311-kube-api-access-t85pj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k6lzm\" (UID: \"b92e7ccb-be3d-464e-bd1e-683db2e34311\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:51:56.241086 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.241064 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8114cacb-ce56-426b-b9ff-1f808f55a706-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "8114cacb-ce56-426b-b9ff-1f808f55a706" (UID: "8114cacb-ce56-426b-b9ff-1f808f55a706"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:51:56.241188 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.241104 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b92e7ccb-be3d-464e-bd1e-683db2e34311-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k6lzm\" (UID: \"b92e7ccb-be3d-464e-bd1e-683db2e34311\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:51:56.243028 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.243005 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8114cacb-ce56-426b-b9ff-1f808f55a706-kube-api-access-65t68" (OuterVolumeSpecName: "kube-api-access-65t68") pod "8114cacb-ce56-426b-b9ff-1f808f55a706" (UID: "8114cacb-ce56-426b-b9ff-1f808f55a706"). InnerVolumeSpecName "kube-api-access-65t68". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:51:56.259159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.259131 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t85pj\" (UniqueName: \"kubernetes.io/projected/b92e7ccb-be3d-464e-bd1e-683db2e34311-kube-api-access-t85pj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k6lzm\" (UID: \"b92e7ccb-be3d-464e-bd1e-683db2e34311\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:51:56.323785 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.323751 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:51:56.342186 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.342154 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65t68\" (UniqueName: \"kubernetes.io/projected/8114cacb-ce56-426b-b9ff-1f808f55a706-kube-api-access-65t68\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 22 18:51:56.342314 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.342191 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8114cacb-ce56-426b-b9ff-1f808f55a706-extensions-socket-volume\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 22 18:51:56.475898 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.475871 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"]
Apr 22 18:51:56.478480 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:51:56.478449 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb92e7ccb_be3d_464e_bd1e_683db2e34311.slice/crio-730290d38c7addf8fb985add9fa99c645b5081ac6de68c5f7ced0b77ac2abecf WatchSource:0}: Error finding container 730290d38c7addf8fb985add9fa99c645b5081ac6de68c5f7ced0b77ac2abecf: Status 404 returned error can't find the container with id 730290d38c7addf8fb985add9fa99c645b5081ac6de68c5f7ced0b77ac2abecf
Apr 22 18:51:56.535364 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:56.535333 2570 status_manager.go:895] "Failed to get status for pod" podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 22 18:51:57.229534 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:57.229494 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm" event={"ID":"b92e7ccb-be3d-464e-bd1e-683db2e34311","Type":"ContainerStarted","Data":"20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad"}
Apr 22 18:51:57.229534 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:57.229543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm" event={"ID":"b92e7ccb-be3d-464e-bd1e-683db2e34311","Type":"ContainerStarted","Data":"730290d38c7addf8fb985add9fa99c645b5081ac6de68c5f7ced0b77ac2abecf"}
Apr 22 18:51:57.230057 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:57.229589 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:51:57.277125 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:57.277067 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm" podStartSLOduration=2.27705027 podStartE2EDuration="2.27705027s" podCreationTimestamp="2026-04-22 18:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:51:57.274546725 +0000 UTC m=+508.258012666" watchObservedRunningTime="2026-04-22 18:51:57.27705027 +0000 UTC m=+508.260516211"
Apr 22 18:51:57.277322 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:57.277297 2570 status_manager.go:895] "Failed to get status for pod" podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-q5tzf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-q5tzf\" is forbidden: User \"system:node:ip-10-0-134-244.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-244.ec2.internal' and this object"
Apr 22 18:51:57.552599 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:51:57.552504 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8114cacb-ce56-426b-b9ff-1f808f55a706" path="/var/lib/kubelet/pods/8114cacb-ce56-426b-b9ff-1f808f55a706/volumes"
Apr 22 18:52:08.235207 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:08.235172 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:52:27.515308 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:27.515225 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"]
Apr 22 18:52:27.515852 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:27.515479 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm" podUID="b92e7ccb-be3d-464e-bd1e-683db2e34311" containerName="manager" containerID="cri-o://20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad" gracePeriod=10
Apr 22 18:52:27.754891 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:27.754859 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:52:27.832483 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:27.832401 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t85pj\" (UniqueName: \"kubernetes.io/projected/b92e7ccb-be3d-464e-bd1e-683db2e34311-kube-api-access-t85pj\") pod \"b92e7ccb-be3d-464e-bd1e-683db2e34311\" (UID: \"b92e7ccb-be3d-464e-bd1e-683db2e34311\") "
Apr 22 18:52:27.832614 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:27.832541 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b92e7ccb-be3d-464e-bd1e-683db2e34311-extensions-socket-volume\") pod \"b92e7ccb-be3d-464e-bd1e-683db2e34311\" (UID: \"b92e7ccb-be3d-464e-bd1e-683db2e34311\") "
Apr 22 18:52:27.832914 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:27.832891 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b92e7ccb-be3d-464e-bd1e-683db2e34311-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "b92e7ccb-be3d-464e-bd1e-683db2e34311" (UID: "b92e7ccb-be3d-464e-bd1e-683db2e34311"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:52:27.834852 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:27.834829 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b92e7ccb-be3d-464e-bd1e-683db2e34311-kube-api-access-t85pj" (OuterVolumeSpecName: "kube-api-access-t85pj") pod "b92e7ccb-be3d-464e-bd1e-683db2e34311" (UID: "b92e7ccb-be3d-464e-bd1e-683db2e34311"). InnerVolumeSpecName "kube-api-access-t85pj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:52:27.934098 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:27.934058 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/b92e7ccb-be3d-464e-bd1e-683db2e34311-extensions-socket-volume\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 22 18:52:27.934098 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:27.934091 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t85pj\" (UniqueName: \"kubernetes.io/projected/b92e7ccb-be3d-464e-bd1e-683db2e34311-kube-api-access-t85pj\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 22 18:52:28.336195 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:28.336153 2570 generic.go:358] "Generic (PLEG): container finished" podID="b92e7ccb-be3d-464e-bd1e-683db2e34311" containerID="20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad" exitCode=0
Apr 22 18:52:28.336407 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:28.336217 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"
Apr 22 18:52:28.336407 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:28.336247 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm" event={"ID":"b92e7ccb-be3d-464e-bd1e-683db2e34311","Type":"ContainerDied","Data":"20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad"}
Apr 22 18:52:28.336407 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:28.336291 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm" event={"ID":"b92e7ccb-be3d-464e-bd1e-683db2e34311","Type":"ContainerDied","Data":"730290d38c7addf8fb985add9fa99c645b5081ac6de68c5f7ced0b77ac2abecf"}
Apr 22 18:52:28.336407 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:28.336311 2570 scope.go:117] "RemoveContainer" containerID="20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad"
Apr 22 18:52:28.345950 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:28.345933 2570 scope.go:117] "RemoveContainer" containerID="20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad"
Apr 22 18:52:28.346219 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:52:28.346202 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad\": container with ID starting with 20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad not found: ID does not exist" containerID="20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad"
Apr 22 18:52:28.346285 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:28.346226 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad"} err="failed to get container status \"20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad\": rpc error: code = NotFound desc = could not find container \"20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad\": container with ID starting with 20ddafec01e9f11e15db5272277093fb9310ae53ebdbbc6e3406fb4956ba18ad not found: ID does not exist"
Apr 22 18:52:28.364797 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:28.364594 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"]
Apr 22 18:52:28.366432 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:28.366409 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k6lzm"]
Apr 22 18:52:29.550523 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:29.550490 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b92e7ccb-be3d-464e-bd1e-683db2e34311" path="/var/lib/kubelet/pods/b92e7ccb-be3d-464e-bd1e-683db2e34311/volumes"
Apr 22 18:52:48.421840 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.421807 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m7tt6"]
Apr 22 18:52:48.422197 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.422153 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92e7ccb-be3d-464e-bd1e-683db2e34311" containerName="manager"
Apr 22 18:52:48.422197 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.422165 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92e7ccb-be3d-464e-bd1e-683db2e34311" containerName="manager"
Apr 22 18:52:48.422292 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.422241 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="b92e7ccb-be3d-464e-bd1e-683db2e34311" containerName="manager"
Apr 22 18:52:48.423956 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.423940 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:52:48.426508 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.426485 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 22 18:52:48.426649 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.426553 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 18:52:48.426649 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.426601 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-wnkm8\""
Apr 22 18:52:48.427426 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.427409 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 18:52:48.431237 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.431052 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m7tt6"]
Apr 22 18:52:48.518041 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.518007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/210231e5-c4a2-4b65-8769-cdd560156499-config-file\") pod \"limitador-limitador-7d549b5b-m7tt6\" (UID: \"210231e5-c4a2-4b65-8769-cdd560156499\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:52:48.518225 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.518208 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f5gl\" (UniqueName: \"kubernetes.io/projected/210231e5-c4a2-4b65-8769-cdd560156499-kube-api-access-6f5gl\") pod \"limitador-limitador-7d549b5b-m7tt6\" (UID: \"210231e5-c4a2-4b65-8769-cdd560156499\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:52:48.523602 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.523575 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m7tt6"]
Apr 22 18:52:48.618961 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.618921 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/210231e5-c4a2-4b65-8769-cdd560156499-config-file\") pod \"limitador-limitador-7d549b5b-m7tt6\" (UID: \"210231e5-c4a2-4b65-8769-cdd560156499\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:52:48.619144 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.619026 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f5gl\" (UniqueName: \"kubernetes.io/projected/210231e5-c4a2-4b65-8769-cdd560156499-kube-api-access-6f5gl\") pod \"limitador-limitador-7d549b5b-m7tt6\" (UID: \"210231e5-c4a2-4b65-8769-cdd560156499\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:52:48.619592 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.619567 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/210231e5-c4a2-4b65-8769-cdd560156499-config-file\") pod \"limitador-limitador-7d549b5b-m7tt6\" (UID: \"210231e5-c4a2-4b65-8769-cdd560156499\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:52:48.626874 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.626852 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f5gl\" (UniqueName: \"kubernetes.io/projected/210231e5-c4a2-4b65-8769-cdd560156499-kube-api-access-6f5gl\") pod \"limitador-limitador-7d549b5b-m7tt6\" (UID: \"210231e5-c4a2-4b65-8769-cdd560156499\") " pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:52:48.735172 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.735107 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:52:48.858112 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:48.858091 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m7tt6"]
Apr 22 18:52:48.860841 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:52:48.860810 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod210231e5_c4a2_4b65_8769_cdd560156499.slice/crio-4243f049e21c9f68e7a53cac305f12edd3d8f858f75f44fd882511f1803dbce9 WatchSource:0}: Error finding container 4243f049e21c9f68e7a53cac305f12edd3d8f858f75f44fd882511f1803dbce9: Status 404 returned error can't find the container with id 4243f049e21c9f68e7a53cac305f12edd3d8f858f75f44fd882511f1803dbce9
Apr 22 18:52:49.291304 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.291274 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-k75wc"]
Apr 22 18:52:49.294314 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.294294 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-k75wc"
Apr 22 18:52:49.296608 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.296588 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-sgc2m\""
Apr 22 18:52:49.303404 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.303381 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-k75wc"]
Apr 22 18:52:49.407016 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.406984 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6" event={"ID":"210231e5-c4a2-4b65-8769-cdd560156499","Type":"ContainerStarted","Data":"4243f049e21c9f68e7a53cac305f12edd3d8f858f75f44fd882511f1803dbce9"}
Apr 22 18:52:49.427485 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.427455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zglzt\" (UniqueName: \"kubernetes.io/projected/f80d251d-515e-411a-9fb8-c9996236d8bb-kube-api-access-zglzt\") pod \"authorino-f99f4b5cd-k75wc\" (UID: \"f80d251d-515e-411a-9fb8-c9996236d8bb\") " pod="kuadrant-system/authorino-f99f4b5cd-k75wc"
Apr 22 18:52:49.529921 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.529418 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zglzt\" (UniqueName: \"kubernetes.io/projected/f80d251d-515e-411a-9fb8-c9996236d8bb-kube-api-access-zglzt\") pod \"authorino-f99f4b5cd-k75wc\" (UID: \"f80d251d-515e-411a-9fb8-c9996236d8bb\") " pod="kuadrant-system/authorino-f99f4b5cd-k75wc"
Apr 22 18:52:49.543430 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.543360 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zglzt\" (UniqueName: \"kubernetes.io/projected/f80d251d-515e-411a-9fb8-c9996236d8bb-kube-api-access-zglzt\") pod \"authorino-f99f4b5cd-k75wc\" (UID: \"f80d251d-515e-411a-9fb8-c9996236d8bb\") " pod="kuadrant-system/authorino-f99f4b5cd-k75wc"
Apr 22 18:52:49.613176 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.613138 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-k75wc"
Apr 22 18:52:49.773277 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:49.773228 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-k75wc"]
Apr 22 18:52:49.777417 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:52:49.777383 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf80d251d_515e_411a_9fb8_c9996236d8bb.slice/crio-7a2145eebc3dd9bad2382a26b30d10ff029b063545349be680f202fe064598af WatchSource:0}: Error finding container 7a2145eebc3dd9bad2382a26b30d10ff029b063545349be680f202fe064598af: Status 404 returned error can't find the container with id 7a2145eebc3dd9bad2382a26b30d10ff029b063545349be680f202fe064598af
Apr 22 18:52:50.419456 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:50.419026 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-k75wc" event={"ID":"f80d251d-515e-411a-9fb8-c9996236d8bb","Type":"ContainerStarted","Data":"7a2145eebc3dd9bad2382a26b30d10ff029b063545349be680f202fe064598af"}
Apr 22 18:52:53.433540 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:53.433502 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6" event={"ID":"210231e5-c4a2-4b65-8769-cdd560156499","Type":"ContainerStarted","Data":"4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb"}
Apr 22 18:52:53.434028 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:53.433676 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:52:53.435206 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:53.435173 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-k75wc" event={"ID":"f80d251d-515e-411a-9fb8-c9996236d8bb","Type":"ContainerStarted","Data":"e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af"}
Apr 22 18:52:53.451300 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:53.451249 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6" podStartSLOduration=1.008189766 podStartE2EDuration="5.451236686s" podCreationTimestamp="2026-04-22 18:52:48 +0000 UTC" firstStartedPulling="2026-04-22 18:52:48.862563684 +0000 UTC m=+559.846029607" lastFinishedPulling="2026-04-22 18:52:53.305610606 +0000 UTC m=+564.289076527" observedRunningTime="2026-04-22 18:52:53.448856188 +0000 UTC m=+564.432322140" watchObservedRunningTime="2026-04-22 18:52:53.451236686 +0000 UTC m=+564.434702686"
Apr 22 18:52:53.462897 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:53.462845 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-k75wc" podStartSLOduration=0.934232537 podStartE2EDuration="4.462827961s" podCreationTimestamp="2026-04-22 18:52:49 +0000 UTC" firstStartedPulling="2026-04-22 18:52:49.77994558 +0000 UTC m=+560.763411505" lastFinishedPulling="2026-04-22 18:52:53.30854101 +0000 UTC m=+564.292006929" observedRunningTime="2026-04-22 18:52:53.461183103 +0000 UTC m=+564.444649041" watchObservedRunningTime="2026-04-22 18:52:53.462827961 +0000 UTC m=+564.446293954"
Apr 22 18:52:53.628961 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:53.628927 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-k75wc"]
Apr 22 18:52:55.442312 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:55.442269 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-k75wc" podUID="f80d251d-515e-411a-9fb8-c9996236d8bb" containerName="authorino" containerID="cri-o://e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af" gracePeriod=30
Apr 22 18:52:55.684516 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:55.684495 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-k75wc"
Apr 22 18:52:55.784398 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:55.784321 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zglzt\" (UniqueName: \"kubernetes.io/projected/f80d251d-515e-411a-9fb8-c9996236d8bb-kube-api-access-zglzt\") pod \"f80d251d-515e-411a-9fb8-c9996236d8bb\" (UID: \"f80d251d-515e-411a-9fb8-c9996236d8bb\") "
Apr 22 18:52:55.786534 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:55.786507 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f80d251d-515e-411a-9fb8-c9996236d8bb-kube-api-access-zglzt" (OuterVolumeSpecName: "kube-api-access-zglzt") pod "f80d251d-515e-411a-9fb8-c9996236d8bb" (UID: "f80d251d-515e-411a-9fb8-c9996236d8bb"). InnerVolumeSpecName "kube-api-access-zglzt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:52:55.885716 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:55.885689 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zglzt\" (UniqueName: \"kubernetes.io/projected/f80d251d-515e-411a-9fb8-c9996236d8bb-kube-api-access-zglzt\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\""
Apr 22 18:52:56.451410 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:56.451376 2570 generic.go:358] "Generic (PLEG): container finished" podID="f80d251d-515e-411a-9fb8-c9996236d8bb" containerID="e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af" exitCode=0
Apr 22 18:52:56.451836 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:56.451430 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-k75wc" event={"ID":"f80d251d-515e-411a-9fb8-c9996236d8bb","Type":"ContainerDied","Data":"e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af"}
Apr 22 18:52:56.451836 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:56.451458 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-k75wc" event={"ID":"f80d251d-515e-411a-9fb8-c9996236d8bb","Type":"ContainerDied","Data":"7a2145eebc3dd9bad2382a26b30d10ff029b063545349be680f202fe064598af"}
Apr 22 18:52:56.451836 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:56.451455 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-k75wc"
Apr 22 18:52:56.451836 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:56.451474 2570 scope.go:117] "RemoveContainer" containerID="e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af"
Apr 22 18:52:56.460304 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:56.460290 2570 scope.go:117] "RemoveContainer" containerID="e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af"
Apr 22 18:52:56.460560 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:52:56.460543 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af\": container with ID starting with e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af not found: ID does not exist" containerID="e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af"
Apr 22 18:52:56.460605 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:56.460569 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af"} err="failed to get container status \"e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af\": rpc error: code = NotFound desc = could not find container \"e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af\": container with ID starting with e368469eaefc5fc6c76146adec023d801a9ed83e47cc34e834e7d5324c7450af not found: ID does not exist"
Apr 22 18:52:56.471083 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:56.471063 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-k75wc"]
Apr 22 18:52:56.475024 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:56.475002 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-k75wc"]
Apr 22 18:52:57.551943 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:52:57.551903 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f80d251d-515e-411a-9fb8-c9996236d8bb" path="/var/lib/kubelet/pods/f80d251d-515e-411a-9fb8-c9996236d8bb/volumes"
Apr 22 18:53:03.107821 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.107776 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m7tt6"]
Apr 22 18:53:03.108379 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.108100 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6" podUID="210231e5-c4a2-4b65-8769-cdd560156499" containerName="limitador" containerID="cri-o://4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb" gracePeriod=30
Apr 22 18:53:03.108795 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.108759 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6"
Apr 22 18:53:03.651104 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.651083 2570 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6" Apr 22 18:53:03.744669 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.744583 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f5gl\" (UniqueName: \"kubernetes.io/projected/210231e5-c4a2-4b65-8769-cdd560156499-kube-api-access-6f5gl\") pod \"210231e5-c4a2-4b65-8769-cdd560156499\" (UID: \"210231e5-c4a2-4b65-8769-cdd560156499\") " Apr 22 18:53:03.744669 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.744641 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/210231e5-c4a2-4b65-8769-cdd560156499-config-file\") pod \"210231e5-c4a2-4b65-8769-cdd560156499\" (UID: \"210231e5-c4a2-4b65-8769-cdd560156499\") " Apr 22 18:53:03.744963 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.744943 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210231e5-c4a2-4b65-8769-cdd560156499-config-file" (OuterVolumeSpecName: "config-file") pod "210231e5-c4a2-4b65-8769-cdd560156499" (UID: "210231e5-c4a2-4b65-8769-cdd560156499"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:53:03.746559 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.746536 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210231e5-c4a2-4b65-8769-cdd560156499-kube-api-access-6f5gl" (OuterVolumeSpecName: "kube-api-access-6f5gl") pod "210231e5-c4a2-4b65-8769-cdd560156499" (UID: "210231e5-c4a2-4b65-8769-cdd560156499"). InnerVolumeSpecName "kube-api-access-6f5gl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:03.845862 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.845822 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6f5gl\" (UniqueName: \"kubernetes.io/projected/210231e5-c4a2-4b65-8769-cdd560156499-kube-api-access-6f5gl\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:53:03.845862 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:03.845861 2570 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/210231e5-c4a2-4b65-8769-cdd560156499-config-file\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:53:04.437566 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.437536 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-9rsrk"] Apr 22 18:53:04.437968 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.437948 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f80d251d-515e-411a-9fb8-c9996236d8bb" containerName="authorino" Apr 22 18:53:04.437968 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.437960 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f80d251d-515e-411a-9fb8-c9996236d8bb" containerName="authorino" Apr 22 18:53:04.438038 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.437984 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="210231e5-c4a2-4b65-8769-cdd560156499" containerName="limitador" Apr 22 18:53:04.438038 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.437989 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="210231e5-c4a2-4b65-8769-cdd560156499" containerName="limitador" Apr 22 18:53:04.438098 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.438042 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="210231e5-c4a2-4b65-8769-cdd560156499" containerName="limitador" Apr 22 18:53:04.438098 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:53:04.438050 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f80d251d-515e-411a-9fb8-c9996236d8bb" containerName="authorino" Apr 22 18:53:04.440162 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.440141 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:04.443129 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.443108 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 22 18:53:04.443317 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.443300 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-jzw66\"" Apr 22 18:53:04.450770 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.450614 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/50557419-fb54-49b5-976b-4cbed1aec8d0-data\") pod \"postgres-868db5846d-9rsrk\" (UID: \"50557419-fb54-49b5-976b-4cbed1aec8d0\") " pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:04.450770 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.450679 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvdzw\" (UniqueName: \"kubernetes.io/projected/50557419-fb54-49b5-976b-4cbed1aec8d0-kube-api-access-rvdzw\") pod \"postgres-868db5846d-9rsrk\" (UID: \"50557419-fb54-49b5-976b-4cbed1aec8d0\") " pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:04.451482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.451465 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-9rsrk"] Apr 22 18:53:04.482363 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.482334 2570 generic.go:358] "Generic (PLEG): container finished" podID="210231e5-c4a2-4b65-8769-cdd560156499" 
containerID="4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb" exitCode=0 Apr 22 18:53:04.482500 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.482393 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6" Apr 22 18:53:04.482500 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.482423 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6" event={"ID":"210231e5-c4a2-4b65-8769-cdd560156499","Type":"ContainerDied","Data":"4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb"} Apr 22 18:53:04.482500 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.482461 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-m7tt6" event={"ID":"210231e5-c4a2-4b65-8769-cdd560156499","Type":"ContainerDied","Data":"4243f049e21c9f68e7a53cac305f12edd3d8f858f75f44fd882511f1803dbce9"} Apr 22 18:53:04.482500 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.482479 2570 scope.go:117] "RemoveContainer" containerID="4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb" Apr 22 18:53:04.490660 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.490636 2570 scope.go:117] "RemoveContainer" containerID="4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb" Apr 22 18:53:04.490939 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:53:04.490920 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb\": container with ID starting with 4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb not found: ID does not exist" containerID="4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb" Apr 22 18:53:04.491020 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.490952 2570 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb"} err="failed to get container status \"4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb\": rpc error: code = NotFound desc = could not find container \"4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb\": container with ID starting with 4074fbe65d9bc67438a106067be90980e67dd73db96f38e600f1b3333f7e80fb not found: ID does not exist" Apr 22 18:53:04.503200 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.503176 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m7tt6"] Apr 22 18:53:04.506484 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.506464 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-m7tt6"] Apr 22 18:53:04.551874 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.551849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/50557419-fb54-49b5-976b-4cbed1aec8d0-data\") pod \"postgres-868db5846d-9rsrk\" (UID: \"50557419-fb54-49b5-976b-4cbed1aec8d0\") " pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:04.552021 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.551883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvdzw\" (UniqueName: \"kubernetes.io/projected/50557419-fb54-49b5-976b-4cbed1aec8d0-kube-api-access-rvdzw\") pod \"postgres-868db5846d-9rsrk\" (UID: \"50557419-fb54-49b5-976b-4cbed1aec8d0\") " pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:04.552268 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.552248 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/50557419-fb54-49b5-976b-4cbed1aec8d0-data\") pod \"postgres-868db5846d-9rsrk\" (UID: 
\"50557419-fb54-49b5-976b-4cbed1aec8d0\") " pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:04.559848 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.559825 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvdzw\" (UniqueName: \"kubernetes.io/projected/50557419-fb54-49b5-976b-4cbed1aec8d0-kube-api-access-rvdzw\") pod \"postgres-868db5846d-9rsrk\" (UID: \"50557419-fb54-49b5-976b-4cbed1aec8d0\") " pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:04.752652 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.752543 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:04.875344 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:04.875322 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-9rsrk"] Apr 22 18:53:04.877465 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:53:04.877434 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50557419_fb54_49b5_976b_4cbed1aec8d0.slice/crio-e5b132cd65d544bd35f27cae15c653e78e669d6f8b851871ffc875c2d0b694ff WatchSource:0}: Error finding container e5b132cd65d544bd35f27cae15c653e78e669d6f8b851871ffc875c2d0b694ff: Status 404 returned error can't find the container with id e5b132cd65d544bd35f27cae15c653e78e669d6f8b851871ffc875c2d0b694ff Apr 22 18:53:05.487103 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:05.487069 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-9rsrk" event={"ID":"50557419-fb54-49b5-976b-4cbed1aec8d0","Type":"ContainerStarted","Data":"e5b132cd65d544bd35f27cae15c653e78e669d6f8b851871ffc875c2d0b694ff"} Apr 22 18:53:05.550364 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:05.550332 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210231e5-c4a2-4b65-8769-cdd560156499" 
path="/var/lib/kubelet/pods/210231e5-c4a2-4b65-8769-cdd560156499/volumes" Apr 22 18:53:11.512108 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:11.512068 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-9rsrk" event={"ID":"50557419-fb54-49b5-976b-4cbed1aec8d0","Type":"ContainerStarted","Data":"d2d4b75bfcee9809bc7b8a992ed1038283839eba10165b74f9564ab91678b127"} Apr 22 18:53:11.512494 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:11.512131 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:11.529431 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:11.529388 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-9rsrk" podStartSLOduration=1.426463089 podStartE2EDuration="7.529375643s" podCreationTimestamp="2026-04-22 18:53:04 +0000 UTC" firstStartedPulling="2026-04-22 18:53:04.879252197 +0000 UTC m=+575.862718117" lastFinishedPulling="2026-04-22 18:53:10.982164752 +0000 UTC m=+581.965630671" observedRunningTime="2026-04-22 18:53:11.527880211 +0000 UTC m=+582.511346153" watchObservedRunningTime="2026-04-22 18:53:11.529375643 +0000 UTC m=+582.512841584" Apr 22 18:53:17.549359 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:17.549332 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-9rsrk" Apr 22 18:53:18.054858 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.054828 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5s49l"] Apr 22 18:53:18.057975 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.057955 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5s49l" Apr 22 18:53:18.060352 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.060315 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:53:18.060352 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.060326 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-sgc2m\"" Apr 22 18:53:18.060514 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.060396 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:53:18.064182 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.064153 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5s49l"] Apr 22 18:53:18.176393 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.176366 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brq65\" (UniqueName: \"kubernetes.io/projected/0c122d48-7d63-4667-861d-9fe8556f97fc-kube-api-access-brq65\") pod \"authorino-8b475cf9f-5s49l\" (UID: \"0c122d48-7d63-4667-861d-9fe8556f97fc\") " pod="kuadrant-system/authorino-8b475cf9f-5s49l" Apr 22 18:53:18.267960 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.267937 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5s49l"] Apr 22 18:53:18.268157 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:53:18.268139 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-brq65], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-5s49l" podUID="0c122d48-7d63-4667-861d-9fe8556f97fc" Apr 22 18:53:18.277538 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.277510 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-brq65\" (UniqueName: \"kubernetes.io/projected/0c122d48-7d63-4667-861d-9fe8556f97fc-kube-api-access-brq65\") pod \"authorino-8b475cf9f-5s49l\" (UID: \"0c122d48-7d63-4667-861d-9fe8556f97fc\") " pod="kuadrant-system/authorino-8b475cf9f-5s49l" Apr 22 18:53:18.288024 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.287998 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brq65\" (UniqueName: \"kubernetes.io/projected/0c122d48-7d63-4667-861d-9fe8556f97fc-kube-api-access-brq65\") pod \"authorino-8b475cf9f-5s49l\" (UID: \"0c122d48-7d63-4667-861d-9fe8556f97fc\") " pod="kuadrant-system/authorino-8b475cf9f-5s49l" Apr 22 18:53:18.292125 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.292104 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-67669fc9f6-gqxgg"] Apr 22 18:53:18.295310 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.295295 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-67669fc9f6-gqxgg" Apr 22 18:53:18.302010 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.301990 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-67669fc9f6-gqxgg"] Apr 22 18:53:18.378597 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.378570 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbx4f\" (UniqueName: \"kubernetes.io/projected/1f46d90a-257c-479d-a7f1-8e1ef8d8ab75-kube-api-access-mbx4f\") pod \"authorino-67669fc9f6-gqxgg\" (UID: \"1f46d90a-257c-479d-a7f1-8e1ef8d8ab75\") " pod="kuadrant-system/authorino-67669fc9f6-gqxgg" Apr 22 18:53:18.479795 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.479774 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbx4f\" (UniqueName: \"kubernetes.io/projected/1f46d90a-257c-479d-a7f1-8e1ef8d8ab75-kube-api-access-mbx4f\") pod \"authorino-67669fc9f6-gqxgg\" (UID: \"1f46d90a-257c-479d-a7f1-8e1ef8d8ab75\") " pod="kuadrant-system/authorino-67669fc9f6-gqxgg" Apr 22 18:53:18.488691 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.488669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbx4f\" (UniqueName: \"kubernetes.io/projected/1f46d90a-257c-479d-a7f1-8e1ef8d8ab75-kube-api-access-mbx4f\") pod \"authorino-67669fc9f6-gqxgg\" (UID: \"1f46d90a-257c-479d-a7f1-8e1ef8d8ab75\") " pod="kuadrant-system/authorino-67669fc9f6-gqxgg" Apr 22 18:53:18.489536 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.489512 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-67669fc9f6-gqxgg"] Apr 22 18:53:18.489730 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.489717 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-67669fc9f6-gqxgg" Apr 22 18:53:18.516170 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.516137 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-85d76994c6-9mntd"] Apr 22 18:53:18.520254 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.520237 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:53:18.522445 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.522428 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 22 18:53:18.527723 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.527699 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85d76994c6-9mntd"] Apr 22 18:53:18.533920 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.533901 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5s49l" Apr 22 18:53:18.551406 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.551386 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5s49l" Apr 22 18:53:18.581155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.581091 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4rd7\" (UniqueName: \"kubernetes.io/projected/566f6de6-0c47-4cd1-994a-cbe6796fc413-kube-api-access-r4rd7\") pod \"authorino-85d76994c6-9mntd\" (UID: \"566f6de6-0c47-4cd1-994a-cbe6796fc413\") " pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:53:18.581155 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.581127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/566f6de6-0c47-4cd1-994a-cbe6796fc413-tls-cert\") pod \"authorino-85d76994c6-9mntd\" (UID: \"566f6de6-0c47-4cd1-994a-cbe6796fc413\") " pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:53:18.618295 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.618275 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-67669fc9f6-gqxgg"] Apr 22 18:53:18.620973 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:53:18.620944 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f46d90a_257c_479d_a7f1_8e1ef8d8ab75.slice/crio-d302b04bd4560bfe8b345be2b3ff1ff42efefc695a600ad6198cc6660cd2ff7d WatchSource:0}: Error finding container d302b04bd4560bfe8b345be2b3ff1ff42efefc695a600ad6198cc6660cd2ff7d: Status 404 returned error can't find the container with id d302b04bd4560bfe8b345be2b3ff1ff42efefc695a600ad6198cc6660cd2ff7d Apr 22 18:53:18.682201 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.682147 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brq65\" (UniqueName: \"kubernetes.io/projected/0c122d48-7d63-4667-861d-9fe8556f97fc-kube-api-access-brq65\") pod 
\"0c122d48-7d63-4667-861d-9fe8556f97fc\" (UID: \"0c122d48-7d63-4667-861d-9fe8556f97fc\") " Apr 22 18:53:18.682329 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.682312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4rd7\" (UniqueName: \"kubernetes.io/projected/566f6de6-0c47-4cd1-994a-cbe6796fc413-kube-api-access-r4rd7\") pod \"authorino-85d76994c6-9mntd\" (UID: \"566f6de6-0c47-4cd1-994a-cbe6796fc413\") " pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:53:18.682398 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.682337 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/566f6de6-0c47-4cd1-994a-cbe6796fc413-tls-cert\") pod \"authorino-85d76994c6-9mntd\" (UID: \"566f6de6-0c47-4cd1-994a-cbe6796fc413\") " pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:53:18.684234 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.684207 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c122d48-7d63-4667-861d-9fe8556f97fc-kube-api-access-brq65" (OuterVolumeSpecName: "kube-api-access-brq65") pod "0c122d48-7d63-4667-861d-9fe8556f97fc" (UID: "0c122d48-7d63-4667-861d-9fe8556f97fc"). InnerVolumeSpecName "kube-api-access-brq65". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:18.684725 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.684710 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/566f6de6-0c47-4cd1-994a-cbe6796fc413-tls-cert\") pod \"authorino-85d76994c6-9mntd\" (UID: \"566f6de6-0c47-4cd1-994a-cbe6796fc413\") " pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:53:18.690482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.690463 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4rd7\" (UniqueName: \"kubernetes.io/projected/566f6de6-0c47-4cd1-994a-cbe6796fc413-kube-api-access-r4rd7\") pod \"authorino-85d76994c6-9mntd\" (UID: \"566f6de6-0c47-4cd1-994a-cbe6796fc413\") " pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:53:18.783611 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.783586 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-brq65\" (UniqueName: \"kubernetes.io/projected/0c122d48-7d63-4667-861d-9fe8556f97fc-kube-api-access-brq65\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:53:18.851646 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.851609 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:53:18.971326 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:18.971297 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-85d76994c6-9mntd"] Apr 22 18:53:18.973917 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:53:18.973890 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod566f6de6_0c47_4cd1_994a_cbe6796fc413.slice/crio-b97192edd192e80a9bf6269d1067c3f17cfb5fb63efd6563ca7b2de584ef703b WatchSource:0}: Error finding container b97192edd192e80a9bf6269d1067c3f17cfb5fb63efd6563ca7b2de584ef703b: Status 404 returned error can't find the container with id b97192edd192e80a9bf6269d1067c3f17cfb5fb63efd6563ca7b2de584ef703b Apr 22 18:53:19.539774 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.539736 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85d76994c6-9mntd" event={"ID":"566f6de6-0c47-4cd1-994a-cbe6796fc413","Type":"ContainerStarted","Data":"dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3"} Apr 22 18:53:19.539774 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.539783 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85d76994c6-9mntd" event={"ID":"566f6de6-0c47-4cd1-994a-cbe6796fc413","Type":"ContainerStarted","Data":"b97192edd192e80a9bf6269d1067c3f17cfb5fb63efd6563ca7b2de584ef703b"} Apr 22 18:53:19.541503 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.541476 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-67669fc9f6-gqxgg" event={"ID":"1f46d90a-257c-479d-a7f1-8e1ef8d8ab75","Type":"ContainerStarted","Data":"a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8"} Apr 22 18:53:19.541633 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.541495 2570 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/authorino-67669fc9f6-gqxgg" podUID="1f46d90a-257c-479d-a7f1-8e1ef8d8ab75" containerName="authorino" containerID="cri-o://a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8" gracePeriod=30 Apr 22 18:53:19.541633 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.541510 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-67669fc9f6-gqxgg" event={"ID":"1f46d90a-257c-479d-a7f1-8e1ef8d8ab75","Type":"ContainerStarted","Data":"d302b04bd4560bfe8b345be2b3ff1ff42efefc695a600ad6198cc6660cd2ff7d"} Apr 22 18:53:19.541633 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.541488 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-5s49l" Apr 22 18:53:19.555850 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.555810 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-85d76994c6-9mntd" podStartSLOduration=1.14111674 podStartE2EDuration="1.555796718s" podCreationTimestamp="2026-04-22 18:53:18 +0000 UTC" firstStartedPulling="2026-04-22 18:53:18.975244905 +0000 UTC m=+589.958710825" lastFinishedPulling="2026-04-22 18:53:19.389924881 +0000 UTC m=+590.373390803" observedRunningTime="2026-04-22 18:53:19.555497316 +0000 UTC m=+590.538963258" watchObservedRunningTime="2026-04-22 18:53:19.555796718 +0000 UTC m=+590.539262693" Apr 22 18:53:19.576858 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.576808 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-67669fc9f6-gqxgg" podStartSLOduration=1.193998213 podStartE2EDuration="1.576789965s" podCreationTimestamp="2026-04-22 18:53:18 +0000 UTC" firstStartedPulling="2026-04-22 18:53:18.622213386 +0000 UTC m=+589.605679305" lastFinishedPulling="2026-04-22 18:53:19.005005138 +0000 UTC m=+589.988471057" observedRunningTime="2026-04-22 18:53:19.575918435 +0000 UTC m=+590.559384380" 
watchObservedRunningTime="2026-04-22 18:53:19.576789965 +0000 UTC m=+590.560255911" Apr 22 18:53:19.609796 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.609772 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5s49l"] Apr 22 18:53:19.613048 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.613019 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-5s49l"] Apr 22 18:53:19.777041 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.777018 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-67669fc9f6-gqxgg" Apr 22 18:53:19.891365 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.891334 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbx4f\" (UniqueName: \"kubernetes.io/projected/1f46d90a-257c-479d-a7f1-8e1ef8d8ab75-kube-api-access-mbx4f\") pod \"1f46d90a-257c-479d-a7f1-8e1ef8d8ab75\" (UID: \"1f46d90a-257c-479d-a7f1-8e1ef8d8ab75\") " Apr 22 18:53:19.893573 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.893543 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f46d90a-257c-479d-a7f1-8e1ef8d8ab75-kube-api-access-mbx4f" (OuterVolumeSpecName: "kube-api-access-mbx4f") pod "1f46d90a-257c-479d-a7f1-8e1ef8d8ab75" (UID: "1f46d90a-257c-479d-a7f1-8e1ef8d8ab75"). InnerVolumeSpecName "kube-api-access-mbx4f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:19.992306 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:19.992283 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbx4f\" (UniqueName: \"kubernetes.io/projected/1f46d90a-257c-479d-a7f1-8e1ef8d8ab75-kube-api-access-mbx4f\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:53:20.328262 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.328229 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-5vdkc"] Apr 22 18:53:20.328745 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.328725 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f46d90a-257c-479d-a7f1-8e1ef8d8ab75" containerName="authorino" Apr 22 18:53:20.328808 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.328748 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f46d90a-257c-479d-a7f1-8e1ef8d8ab75" containerName="authorino" Apr 22 18:53:20.328877 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.328864 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f46d90a-257c-479d-a7f1-8e1ef8d8ab75" containerName="authorino" Apr 22 18:53:20.347893 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.347867 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-5vdkc"] Apr 22 18:53:20.347998 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.347919 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" Apr 22 18:53:20.350466 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.350445 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-6z92h\"" Apr 22 18:53:20.395691 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.395651 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr2pr\" (UniqueName: \"kubernetes.io/projected/4e4ef54e-cd64-4572-b2f0-fe335f6b33fd-kube-api-access-wr2pr\") pod \"maas-controller-6d4c8f55f9-5vdkc\" (UID: \"4e4ef54e-cd64-4572-b2f0-fe335f6b33fd\") " pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" Apr 22 18:53:20.480501 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.480477 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-68cddf8959-pn2q6"] Apr 22 18:53:20.496247 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.496220 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr2pr\" (UniqueName: \"kubernetes.io/projected/4e4ef54e-cd64-4572-b2f0-fe335f6b33fd-kube-api-access-wr2pr\") pod \"maas-controller-6d4c8f55f9-5vdkc\" (UID: \"4e4ef54e-cd64-4572-b2f0-fe335f6b33fd\") " pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" Apr 22 18:53:20.498767 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.498746 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-68cddf8959-pn2q6"] Apr 22 18:53:20.498865 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.498853 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-68cddf8959-pn2q6" Apr 22 18:53:20.505371 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.505309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr2pr\" (UniqueName: \"kubernetes.io/projected/4e4ef54e-cd64-4572-b2f0-fe335f6b33fd-kube-api-access-wr2pr\") pod \"maas-controller-6d4c8f55f9-5vdkc\" (UID: \"4e4ef54e-cd64-4572-b2f0-fe335f6b33fd\") " pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" Apr 22 18:53:20.546173 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.546100 2570 generic.go:358] "Generic (PLEG): container finished" podID="1f46d90a-257c-479d-a7f1-8e1ef8d8ab75" containerID="a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8" exitCode=0 Apr 22 18:53:20.546173 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.546162 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-67669fc9f6-gqxgg" Apr 22 18:53:20.546339 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.546193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-67669fc9f6-gqxgg" event={"ID":"1f46d90a-257c-479d-a7f1-8e1ef8d8ab75","Type":"ContainerDied","Data":"a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8"} Apr 22 18:53:20.546339 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.546232 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-67669fc9f6-gqxgg" event={"ID":"1f46d90a-257c-479d-a7f1-8e1ef8d8ab75","Type":"ContainerDied","Data":"d302b04bd4560bfe8b345be2b3ff1ff42efefc695a600ad6198cc6660cd2ff7d"} Apr 22 18:53:20.546339 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.546251 2570 scope.go:117] "RemoveContainer" containerID="a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8" Apr 22 18:53:20.554470 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.554452 2570 scope.go:117] "RemoveContainer" 
containerID="a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8" Apr 22 18:53:20.554804 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:53:20.554780 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8\": container with ID starting with a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8 not found: ID does not exist" containerID="a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8" Apr 22 18:53:20.554884 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.554812 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8"} err="failed to get container status \"a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8\": rpc error: code = NotFound desc = could not find container \"a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8\": container with ID starting with a4f07760ddf0ab03e24fc65ea497aa115f9d71a1ff6d6b3747408c562f6bfae8 not found: ID does not exist" Apr 22 18:53:20.566594 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.566573 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-67669fc9f6-gqxgg"] Apr 22 18:53:20.571321 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.571298 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-67669fc9f6-gqxgg"] Apr 22 18:53:20.594887 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.594865 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-68cddf8959-pn2q6"] Apr 22 18:53:20.595060 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:53:20.595045 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-b2s54], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="opendatahub/maas-controller-68cddf8959-pn2q6" podUID="51c93334-fecc-4500-b838-6206be98dc49" Apr 22 18:53:20.597590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.597568 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2s54\" (UniqueName: \"kubernetes.io/projected/51c93334-fecc-4500-b838-6206be98dc49-kube-api-access-b2s54\") pod \"maas-controller-68cddf8959-pn2q6\" (UID: \"51c93334-fecc-4500-b838-6206be98dc49\") " pod="opendatahub/maas-controller-68cddf8959-pn2q6" Apr 22 18:53:20.620571 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.620550 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5d44dbddb-9pdk5"] Apr 22 18:53:20.624145 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.624129 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" Apr 22 18:53:20.632232 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.632211 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-9pdk5"] Apr 22 18:53:20.658281 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.658259 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" Apr 22 18:53:20.698289 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.698259 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2s54\" (UniqueName: \"kubernetes.io/projected/51c93334-fecc-4500-b838-6206be98dc49-kube-api-access-b2s54\") pod \"maas-controller-68cddf8959-pn2q6\" (UID: \"51c93334-fecc-4500-b838-6206be98dc49\") " pod="opendatahub/maas-controller-68cddf8959-pn2q6" Apr 22 18:53:20.698409 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.698303 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdk79\" (UniqueName: \"kubernetes.io/projected/d74df894-56ad-4f54-946b-9b667b89626b-kube-api-access-kdk79\") pod \"maas-controller-5d44dbddb-9pdk5\" (UID: \"d74df894-56ad-4f54-946b-9b667b89626b\") " pod="opendatahub/maas-controller-5d44dbddb-9pdk5" Apr 22 18:53:20.709793 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.709749 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2s54\" (UniqueName: \"kubernetes.io/projected/51c93334-fecc-4500-b838-6206be98dc49-kube-api-access-b2s54\") pod \"maas-controller-68cddf8959-pn2q6\" (UID: \"51c93334-fecc-4500-b838-6206be98dc49\") " pod="opendatahub/maas-controller-68cddf8959-pn2q6" Apr 22 18:53:20.779646 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.779562 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-5vdkc"] Apr 22 18:53:20.781898 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:53:20.781868 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e4ef54e_cd64_4572_b2f0_fe335f6b33fd.slice/crio-b36cfb27d8abf8a8449dc6a14f3d3ea2e935fbb9fef401d7763f194326ab5694 WatchSource:0}: Error finding container 
b36cfb27d8abf8a8449dc6a14f3d3ea2e935fbb9fef401d7763f194326ab5694: Status 404 returned error can't find the container with id b36cfb27d8abf8a8449dc6a14f3d3ea2e935fbb9fef401d7763f194326ab5694 Apr 22 18:53:20.799353 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.799299 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdk79\" (UniqueName: \"kubernetes.io/projected/d74df894-56ad-4f54-946b-9b667b89626b-kube-api-access-kdk79\") pod \"maas-controller-5d44dbddb-9pdk5\" (UID: \"d74df894-56ad-4f54-946b-9b667b89626b\") " pod="opendatahub/maas-controller-5d44dbddb-9pdk5" Apr 22 18:53:20.807249 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.807231 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdk79\" (UniqueName: \"kubernetes.io/projected/d74df894-56ad-4f54-946b-9b667b89626b-kube-api-access-kdk79\") pod \"maas-controller-5d44dbddb-9pdk5\" (UID: \"d74df894-56ad-4f54-946b-9b667b89626b\") " pod="opendatahub/maas-controller-5d44dbddb-9pdk5" Apr 22 18:53:20.933980 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:20.933952 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" Apr 22 18:53:21.053767 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.053743 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-9pdk5"] Apr 22 18:53:21.055786 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:53:21.055756 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74df894_56ad_4f54_946b_9b667b89626b.slice/crio-ab79553b8c29100a44287a4465c0f1e536ad34bba84702d5dd2b049019f1efc5 WatchSource:0}: Error finding container ab79553b8c29100a44287a4465c0f1e536ad34bba84702d5dd2b049019f1efc5: Status 404 returned error can't find the container with id ab79553b8c29100a44287a4465c0f1e536ad34bba84702d5dd2b049019f1efc5 Apr 22 18:53:21.551236 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.551192 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c122d48-7d63-4667-861d-9fe8556f97fc" path="/var/lib/kubelet/pods/0c122d48-7d63-4667-861d-9fe8556f97fc/volumes" Apr 22 18:53:21.551561 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.551542 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f46d90a-257c-479d-a7f1-8e1ef8d8ab75" path="/var/lib/kubelet/pods/1f46d90a-257c-479d-a7f1-8e1ef8d8ab75/volumes" Apr 22 18:53:21.552675 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.552647 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" event={"ID":"4e4ef54e-cd64-4572-b2f0-fe335f6b33fd","Type":"ContainerStarted","Data":"b36cfb27d8abf8a8449dc6a14f3d3ea2e935fbb9fef401d7763f194326ab5694"} Apr 22 18:53:21.553779 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.553741 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" 
event={"ID":"d74df894-56ad-4f54-946b-9b667b89626b","Type":"ContainerStarted","Data":"ab79553b8c29100a44287a4465c0f1e536ad34bba84702d5dd2b049019f1efc5"} Apr 22 18:53:21.554889 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.554864 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68cddf8959-pn2q6" Apr 22 18:53:21.561330 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.561309 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68cddf8959-pn2q6" Apr 22 18:53:21.707199 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.707162 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2s54\" (UniqueName: \"kubernetes.io/projected/51c93334-fecc-4500-b838-6206be98dc49-kube-api-access-b2s54\") pod \"51c93334-fecc-4500-b838-6206be98dc49\" (UID: \"51c93334-fecc-4500-b838-6206be98dc49\") " Apr 22 18:53:21.712339 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.712298 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51c93334-fecc-4500-b838-6206be98dc49-kube-api-access-b2s54" (OuterVolumeSpecName: "kube-api-access-b2s54") pod "51c93334-fecc-4500-b838-6206be98dc49" (UID: "51c93334-fecc-4500-b838-6206be98dc49"). InnerVolumeSpecName "kube-api-access-b2s54". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:21.808201 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:21.808128 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b2s54\" (UniqueName: \"kubernetes.io/projected/51c93334-fecc-4500-b838-6206be98dc49-kube-api-access-b2s54\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:53:22.560051 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:22.560021 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-68cddf8959-pn2q6" Apr 22 18:53:22.597126 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:22.597092 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-68cddf8959-pn2q6"] Apr 22 18:53:22.601534 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:22.601509 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-68cddf8959-pn2q6"] Apr 22 18:53:23.551686 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:23.551654 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51c93334-fecc-4500-b838-6206be98dc49" path="/var/lib/kubelet/pods/51c93334-fecc-4500-b838-6206be98dc49/volumes" Apr 22 18:53:25.569773 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:25.569729 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" event={"ID":"4e4ef54e-cd64-4572-b2f0-fe335f6b33fd","Type":"ContainerStarted","Data":"8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674"} Apr 22 18:53:25.570252 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:25.569855 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" Apr 22 18:53:25.571074 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:25.571051 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" event={"ID":"d74df894-56ad-4f54-946b-9b667b89626b","Type":"ContainerStarted","Data":"d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663"} Apr 22 18:53:25.571194 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:25.571176 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" Apr 22 18:53:25.588240 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:25.588200 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" podStartSLOduration=1.80462187 podStartE2EDuration="5.588187892s" podCreationTimestamp="2026-04-22 18:53:20 +0000 UTC" firstStartedPulling="2026-04-22 18:53:20.783220535 +0000 UTC m=+591.766686454" lastFinishedPulling="2026-04-22 18:53:24.566786557 +0000 UTC m=+595.550252476" observedRunningTime="2026-04-22 18:53:25.586209485 +0000 UTC m=+596.569675428" watchObservedRunningTime="2026-04-22 18:53:25.588187892 +0000 UTC m=+596.571653833" Apr 22 18:53:25.602652 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:25.602590 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" podStartSLOduration=2.086708914 podStartE2EDuration="5.602576429s" podCreationTimestamp="2026-04-22 18:53:20 +0000 UTC" firstStartedPulling="2026-04-22 18:53:21.05707328 +0000 UTC m=+592.040539201" lastFinishedPulling="2026-04-22 18:53:24.572940797 +0000 UTC m=+595.556406716" observedRunningTime="2026-04-22 18:53:25.602197811 +0000 UTC m=+596.585663753" watchObservedRunningTime="2026-04-22 18:53:25.602576429 +0000 UTC m=+596.586042371" Apr 22 18:53:29.431821 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:29.431796 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/0.log" Apr 22 18:53:29.431821 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:29.431806 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/0.log" Apr 22 18:53:36.579820 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.579789 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" Apr 22 18:53:36.581081 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.579840 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" Apr 22 18:53:36.640966 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.640936 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-5vdkc"] Apr 22 18:53:36.641139 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.641103 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" podUID="4e4ef54e-cd64-4572-b2f0-fe335f6b33fd" containerName="manager" containerID="cri-o://8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674" gracePeriod=10 Apr 22 18:53:36.877590 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.877567 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" Apr 22 18:53:36.932387 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.932348 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-db5bb48f4-8jbln"] Apr 22 18:53:36.932754 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.932740 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e4ef54e-cd64-4572-b2f0-fe335f6b33fd" containerName="manager" Apr 22 18:53:36.932803 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.932756 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4ef54e-cd64-4572-b2f0-fe335f6b33fd" containerName="manager" Apr 22 18:53:36.932837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.932817 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e4ef54e-cd64-4572-b2f0-fe335f6b33fd" containerName="manager" Apr 22 18:53:36.934545 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.934530 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-db5bb48f4-8jbln" Apr 22 18:53:36.943610 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:36.943581 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-db5bb48f4-8jbln"] Apr 22 18:53:37.039874 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.039853 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr2pr\" (UniqueName: \"kubernetes.io/projected/4e4ef54e-cd64-4572-b2f0-fe335f6b33fd-kube-api-access-wr2pr\") pod \"4e4ef54e-cd64-4572-b2f0-fe335f6b33fd\" (UID: \"4e4ef54e-cd64-4572-b2f0-fe335f6b33fd\") " Apr 22 18:53:37.040070 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.040056 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9vv\" (UniqueName: \"kubernetes.io/projected/a23fa10e-e945-432b-98d5-adf26b100e76-kube-api-access-nf9vv\") pod \"maas-controller-db5bb48f4-8jbln\" (UID: \"a23fa10e-e945-432b-98d5-adf26b100e76\") " pod="opendatahub/maas-controller-db5bb48f4-8jbln" Apr 22 18:53:37.042105 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.042080 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e4ef54e-cd64-4572-b2f0-fe335f6b33fd-kube-api-access-wr2pr" (OuterVolumeSpecName: "kube-api-access-wr2pr") pod "4e4ef54e-cd64-4572-b2f0-fe335f6b33fd" (UID: "4e4ef54e-cd64-4572-b2f0-fe335f6b33fd"). InnerVolumeSpecName "kube-api-access-wr2pr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:37.140995 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.140937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9vv\" (UniqueName: \"kubernetes.io/projected/a23fa10e-e945-432b-98d5-adf26b100e76-kube-api-access-nf9vv\") pod \"maas-controller-db5bb48f4-8jbln\" (UID: \"a23fa10e-e945-432b-98d5-adf26b100e76\") " pod="opendatahub/maas-controller-db5bb48f4-8jbln" Apr 22 18:53:37.141078 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.141012 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wr2pr\" (UniqueName: \"kubernetes.io/projected/4e4ef54e-cd64-4572-b2f0-fe335f6b33fd-kube-api-access-wr2pr\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:53:37.151434 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.151411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9vv\" (UniqueName: \"kubernetes.io/projected/a23fa10e-e945-432b-98d5-adf26b100e76-kube-api-access-nf9vv\") pod \"maas-controller-db5bb48f4-8jbln\" (UID: \"a23fa10e-e945-432b-98d5-adf26b100e76\") " pod="opendatahub/maas-controller-db5bb48f4-8jbln" Apr 22 18:53:37.245380 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.245349 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-db5bb48f4-8jbln" Apr 22 18:53:37.367420 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.367400 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-db5bb48f4-8jbln"] Apr 22 18:53:37.369559 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:53:37.369528 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda23fa10e_e945_432b_98d5_adf26b100e76.slice/crio-51d3b3e104281350311901ca9afc06254f1b423f869120d5e55caea22d574f0f WatchSource:0}: Error finding container 51d3b3e104281350311901ca9afc06254f1b423f869120d5e55caea22d574f0f: Status 404 returned error can't find the container with id 51d3b3e104281350311901ca9afc06254f1b423f869120d5e55caea22d574f0f Apr 22 18:53:37.612104 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.612073 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db5bb48f4-8jbln" event={"ID":"a23fa10e-e945-432b-98d5-adf26b100e76","Type":"ContainerStarted","Data":"51d3b3e104281350311901ca9afc06254f1b423f869120d5e55caea22d574f0f"} Apr 22 18:53:37.613211 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.613188 2570 generic.go:358] "Generic (PLEG): container finished" podID="4e4ef54e-cd64-4572-b2f0-fe335f6b33fd" containerID="8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674" exitCode=0 Apr 22 18:53:37.613315 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.613247 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" Apr 22 18:53:37.613315 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.613280 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" event={"ID":"4e4ef54e-cd64-4572-b2f0-fe335f6b33fd","Type":"ContainerDied","Data":"8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674"} Apr 22 18:53:37.613428 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.613321 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-5vdkc" event={"ID":"4e4ef54e-cd64-4572-b2f0-fe335f6b33fd","Type":"ContainerDied","Data":"b36cfb27d8abf8a8449dc6a14f3d3ea2e935fbb9fef401d7763f194326ab5694"} Apr 22 18:53:37.613428 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.613342 2570 scope.go:117] "RemoveContainer" containerID="8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674" Apr 22 18:53:37.621034 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.621005 2570 scope.go:117] "RemoveContainer" containerID="8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674" Apr 22 18:53:37.621276 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:53:37.621258 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674\": container with ID starting with 8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674 not found: ID does not exist" containerID="8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674" Apr 22 18:53:37.621329 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.621283 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674"} err="failed to get container status \"8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674\": rpc error: 
code = NotFound desc = could not find container \"8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674\": container with ID starting with 8e308e93d8865a0a1447aa77d1282e9c599fa1761163f00ee8c542d8bcc0d674 not found: ID does not exist" Apr 22 18:53:37.630309 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.630288 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-5vdkc"] Apr 22 18:53:37.633822 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:37.633804 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-5vdkc"] Apr 22 18:53:38.619569 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:38.619535 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-db5bb48f4-8jbln" event={"ID":"a23fa10e-e945-432b-98d5-adf26b100e76","Type":"ContainerStarted","Data":"36aa620aa8ee5f2de64f0be3410674fff3a60668ad428acccce333f23e1a84b6"} Apr 22 18:53:38.619966 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:38.619678 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-db5bb48f4-8jbln" Apr 22 18:53:38.636847 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:38.636761 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-db5bb48f4-8jbln" podStartSLOduration=2.275239733 podStartE2EDuration="2.636748161s" podCreationTimestamp="2026-04-22 18:53:36 +0000 UTC" firstStartedPulling="2026-04-22 18:53:37.370882139 +0000 UTC m=+608.354348057" lastFinishedPulling="2026-04-22 18:53:37.732390563 +0000 UTC m=+608.715856485" observedRunningTime="2026-04-22 18:53:38.636656177 +0000 UTC m=+609.620122118" watchObservedRunningTime="2026-04-22 18:53:38.636748161 +0000 UTC m=+609.620214101" Apr 22 18:53:39.550639 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:39.550597 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4e4ef54e-cd64-4572-b2f0-fe335f6b33fd" path="/var/lib/kubelet/pods/4e4ef54e-cd64-4572-b2f0-fe335f6b33fd/volumes" Apr 22 18:53:49.628384 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:49.628310 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-db5bb48f4-8jbln" Apr 22 18:53:49.674722 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:49.674696 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-9pdk5"] Apr 22 18:53:49.674934 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:49.674896 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" podUID="d74df894-56ad-4f54-946b-9b667b89626b" containerName="manager" containerID="cri-o://d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663" gracePeriod=10 Apr 22 18:53:49.914227 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:49.914207 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" Apr 22 18:53:50.034860 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.034834 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdk79\" (UniqueName: \"kubernetes.io/projected/d74df894-56ad-4f54-946b-9b667b89626b-kube-api-access-kdk79\") pod \"d74df894-56ad-4f54-946b-9b667b89626b\" (UID: \"d74df894-56ad-4f54-946b-9b667b89626b\") " Apr 22 18:53:50.037001 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.036971 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74df894-56ad-4f54-946b-9b667b89626b-kube-api-access-kdk79" (OuterVolumeSpecName: "kube-api-access-kdk79") pod "d74df894-56ad-4f54-946b-9b667b89626b" (UID: "d74df894-56ad-4f54-946b-9b667b89626b"). InnerVolumeSpecName "kube-api-access-kdk79". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:50.136182 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.136160 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kdk79\" (UniqueName: \"kubernetes.io/projected/d74df894-56ad-4f54-946b-9b667b89626b-kube-api-access-kdk79\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:53:50.658375 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.658337 2570 generic.go:358] "Generic (PLEG): container finished" podID="d74df894-56ad-4f54-946b-9b667b89626b" containerID="d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663" exitCode=0 Apr 22 18:53:50.658761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.658397 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" Apr 22 18:53:50.658761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.658408 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" event={"ID":"d74df894-56ad-4f54-946b-9b667b89626b","Type":"ContainerDied","Data":"d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663"} Apr 22 18:53:50.658761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.658443 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5d44dbddb-9pdk5" event={"ID":"d74df894-56ad-4f54-946b-9b667b89626b","Type":"ContainerDied","Data":"ab79553b8c29100a44287a4465c0f1e536ad34bba84702d5dd2b049019f1efc5"} Apr 22 18:53:50.658761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.658462 2570 scope.go:117] "RemoveContainer" containerID="d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663" Apr 22 18:53:50.668310 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.668288 2570 scope.go:117] "RemoveContainer" containerID="d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663" Apr 22 18:53:50.668795 ip-10-0-134-244 kubenswrapper[2570]: 
E0422 18:53:50.668773 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663\": container with ID starting with d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663 not found: ID does not exist" containerID="d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663" Apr 22 18:53:50.668887 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.668801 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663"} err="failed to get container status \"d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663\": rpc error: code = NotFound desc = could not find container \"d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663\": container with ID starting with d1d1a38f61e6ed0ae8c275307a39b802bd32702d8a4df3e585ea2e2bce0de663 not found: ID does not exist" Apr 22 18:53:50.693418 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.693393 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-9pdk5"] Apr 22 18:53:50.700252 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:50.700219 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5d44dbddb-9pdk5"] Apr 22 18:53:51.549911 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:51.549879 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74df894-56ad-4f54-946b-9b667b89626b" path="/var/lib/kubelet/pods/d74df894-56ad-4f54-946b-9b667b89626b/volumes" Apr 22 18:53:56.389482 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.389449 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-74c6cd7d45-wh2fn"] Apr 22 18:53:56.389938 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.389822 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="d74df894-56ad-4f54-946b-9b667b89626b" containerName="manager" Apr 22 18:53:56.389938 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.389835 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74df894-56ad-4f54-946b-9b667b89626b" containerName="manager" Apr 22 18:53:56.389938 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.389904 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d74df894-56ad-4f54-946b-9b667b89626b" containerName="manager" Apr 22 18:53:56.394341 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.394322 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:53:56.398438 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.398421 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-hjs9h\"" Apr 22 18:53:56.399531 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.399515 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 22 18:53:56.399661 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.399533 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 22 18:53:56.404430 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.404411 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-74c6cd7d45-wh2fn"] Apr 22 18:53:56.587148 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.587117 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp7wx\" (UniqueName: \"kubernetes.io/projected/1cd1085c-f2fc-48d5-880f-6737130411a7-kube-api-access-kp7wx\") pod \"maas-api-74c6cd7d45-wh2fn\" (UID: \"1cd1085c-f2fc-48d5-880f-6737130411a7\") " pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:53:56.587299 ip-10-0-134-244 kubenswrapper[2570]: 
I0422 18:53:56.587170 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1cd1085c-f2fc-48d5-880f-6737130411a7-maas-api-tls\") pod \"maas-api-74c6cd7d45-wh2fn\" (UID: \"1cd1085c-f2fc-48d5-880f-6737130411a7\") " pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:53:56.688391 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.688326 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1cd1085c-f2fc-48d5-880f-6737130411a7-maas-api-tls\") pod \"maas-api-74c6cd7d45-wh2fn\" (UID: \"1cd1085c-f2fc-48d5-880f-6737130411a7\") " pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:53:56.688534 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.688420 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp7wx\" (UniqueName: \"kubernetes.io/projected/1cd1085c-f2fc-48d5-880f-6737130411a7-kube-api-access-kp7wx\") pod \"maas-api-74c6cd7d45-wh2fn\" (UID: \"1cd1085c-f2fc-48d5-880f-6737130411a7\") " pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:53:56.690772 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.690747 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1cd1085c-f2fc-48d5-880f-6737130411a7-maas-api-tls\") pod \"maas-api-74c6cd7d45-wh2fn\" (UID: \"1cd1085c-f2fc-48d5-880f-6737130411a7\") " pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:53:56.708721 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:56.708696 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp7wx\" (UniqueName: \"kubernetes.io/projected/1cd1085c-f2fc-48d5-880f-6737130411a7-kube-api-access-kp7wx\") pod \"maas-api-74c6cd7d45-wh2fn\" (UID: \"1cd1085c-f2fc-48d5-880f-6737130411a7\") " pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:53:57.005491 
ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:57.005426 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:53:57.134525 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:57.134497 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-74c6cd7d45-wh2fn"] Apr 22 18:53:57.136101 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:53:57.136063 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cd1085c_f2fc_48d5_880f_6737130411a7.slice/crio-15f7ec067fa377d66fd0ed481f42ee38d566be3b61aa01b4fd810c647909f742 WatchSource:0}: Error finding container 15f7ec067fa377d66fd0ed481f42ee38d566be3b61aa01b4fd810c647909f742: Status 404 returned error can't find the container with id 15f7ec067fa377d66fd0ed481f42ee38d566be3b61aa01b4fd810c647909f742 Apr 22 18:53:57.684154 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:57.684121 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74c6cd7d45-wh2fn" event={"ID":"1cd1085c-f2fc-48d5-880f-6737130411a7","Type":"ContainerStarted","Data":"15f7ec067fa377d66fd0ed481f42ee38d566be3b61aa01b4fd810c647909f742"} Apr 22 18:53:59.691737 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:59.691693 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-74c6cd7d45-wh2fn" event={"ID":"1cd1085c-f2fc-48d5-880f-6737130411a7","Type":"ContainerStarted","Data":"62a32654d66ac641f613abc88aa2905daec6c34918b9561fdf87f097f7c7e067"} Apr 22 18:53:59.692195 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:59.691811 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:53:59.724307 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:53:59.724249 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-74c6cd7d45-wh2fn" 
podStartSLOduration=1.719522515 podStartE2EDuration="3.724234534s" podCreationTimestamp="2026-04-22 18:53:56 +0000 UTC" firstStartedPulling="2026-04-22 18:53:57.137398457 +0000 UTC m=+628.120864377" lastFinishedPulling="2026-04-22 18:53:59.142110477 +0000 UTC m=+630.125576396" observedRunningTime="2026-04-22 18:53:59.72333466 +0000 UTC m=+630.706800600" watchObservedRunningTime="2026-04-22 18:53:59.724234534 +0000 UTC m=+630.707700474" Apr 22 18:54:05.668300 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.668269 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8"] Apr 22 18:54:05.670836 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.670819 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.674641 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.674584 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-gxmms\"" Apr 22 18:54:05.674766 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.674638 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 22 18:54:05.674766 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.674612 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 18:54:05.674766 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.674588 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 18:54:05.680822 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.680797 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8"] Apr 22 18:54:05.701210 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.701189 2570 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-74c6cd7d45-wh2fn" Apr 22 18:54:05.764571 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.764541 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.764571 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.764573 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x84kr\" (UniqueName: \"kubernetes.io/projected/5529bc06-bc50-45fb-bc77-b11f50a74612-kube-api-access-x84kr\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.764797 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.764590 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.764797 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.764691 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.764797 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.764727 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5529bc06-bc50-45fb-bc77-b11f50a74612-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.764797 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.764785 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.865944 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.865915 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.865944 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.865944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x84kr\" (UniqueName: \"kubernetes.io/projected/5529bc06-bc50-45fb-bc77-b11f50a74612-kube-api-access-x84kr\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.866150 
ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.865964 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.866150 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.865993 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.866150 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.866017 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5529bc06-bc50-45fb-bc77-b11f50a74612-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.866150 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.866065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.866379 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.866272 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.866379 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.866328 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.866468 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.866383 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.868423 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.868393 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5529bc06-bc50-45fb-bc77-b11f50a74612-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.868756 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.868736 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5529bc06-bc50-45fb-bc77-b11f50a74612-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.875353 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.875333 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x84kr\" (UniqueName: \"kubernetes.io/projected/5529bc06-bc50-45fb-bc77-b11f50a74612-kube-api-access-x84kr\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-j92m8\" (UID: \"5529bc06-bc50-45fb-bc77-b11f50a74612\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:05.982287 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:05.982238 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:06.105159 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:06.105136 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8"] Apr 22 18:54:06.106912 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:54:06.106871 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5529bc06_bc50_45fb_bc77_b11f50a74612.slice/crio-21262bb6dd9a7a7ec31b5d82830326661052901074f933b23a8221991cccad01 WatchSource:0}: Error finding container 21262bb6dd9a7a7ec31b5d82830326661052901074f933b23a8221991cccad01: Status 404 returned error can't find the container with id 21262bb6dd9a7a7ec31b5d82830326661052901074f933b23a8221991cccad01 Apr 22 18:54:06.717196 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:06.717156 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" event={"ID":"5529bc06-bc50-45fb-bc77-b11f50a74612","Type":"ContainerStarted","Data":"21262bb6dd9a7a7ec31b5d82830326661052901074f933b23a8221991cccad01"} Apr 22 18:54:13.750888 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:13.750850 2570 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" event={"ID":"5529bc06-bc50-45fb-bc77-b11f50a74612","Type":"ContainerStarted","Data":"67c6fadc324ad56a18496e74b5c2d3346380088d3b3488dbe04ac63bd35beaea"} Apr 22 18:54:21.780927 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:21.780890 2570 generic.go:358] "Generic (PLEG): container finished" podID="5529bc06-bc50-45fb-bc77-b11f50a74612" containerID="67c6fadc324ad56a18496e74b5c2d3346380088d3b3488dbe04ac63bd35beaea" exitCode=0 Apr 22 18:54:21.781305 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:21.780969 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" event={"ID":"5529bc06-bc50-45fb-bc77-b11f50a74612","Type":"ContainerDied","Data":"67c6fadc324ad56a18496e74b5c2d3346380088d3b3488dbe04ac63bd35beaea"} Apr 22 18:54:21.781580 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:21.781565 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:54:23.789206 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:23.789172 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" event={"ID":"5529bc06-bc50-45fb-bc77-b11f50a74612","Type":"ContainerStarted","Data":"c2504eb56926dd895d636f55d98b0ebf1512ae43817edeed872072d91d4f18ee"} Apr 22 18:54:23.789585 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:23.789377 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:23.807890 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:23.807834 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" podStartSLOduration=2.093605399 podStartE2EDuration="18.807821487s" podCreationTimestamp="2026-04-22 18:54:05 +0000 
UTC" firstStartedPulling="2026-04-22 18:54:06.108696928 +0000 UTC m=+637.092162847" lastFinishedPulling="2026-04-22 18:54:22.822913017 +0000 UTC m=+653.806378935" observedRunningTime="2026-04-22 18:54:23.806392185 +0000 UTC m=+654.789858141" watchObservedRunningTime="2026-04-22 18:54:23.807821487 +0000 UTC m=+654.791287428" Apr 22 18:54:34.805964 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:34.805917 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-j92m8" Apr 22 18:54:40.152021 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.151975 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc"] Apr 22 18:54:40.406324 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.406244 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc"] Apr 22 18:54:40.406486 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.406363 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.409169 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.409147 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 22 18:54:40.590124 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.590080 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.590290 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.590138 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1318cf-689a-4d60-a307-2b078e60cef7-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.590290 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.590247 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.590290 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.590278 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-kserve-provision-location\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.590421 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.590359 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.590466 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.590424 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b992f\" (UniqueName: \"kubernetes.io/projected/2b1318cf-689a-4d60-a307-2b078e60cef7-kube-api-access-b992f\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.691308 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.691219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.691308 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.691272 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1318cf-689a-4d60-a307-2b078e60cef7-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.691526 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.691323 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.691526 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.691349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.691526 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.691390 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.691526 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.691438 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b992f\" (UniqueName: \"kubernetes.io/projected/2b1318cf-689a-4d60-a307-2b078e60cef7-kube-api-access-b992f\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.691819 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.691785 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: 
\"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.691932 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.691839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.691978 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.691951 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.693828 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.693801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2b1318cf-689a-4d60-a307-2b078e60cef7-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.693922 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.693902 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1318cf-689a-4d60-a307-2b078e60cef7-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.699430 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.699409 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b992f\" (UniqueName: 
\"kubernetes.io/projected/2b1318cf-689a-4d60-a307-2b078e60cef7-kube-api-access-b992f\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc\" (UID: \"2b1318cf-689a-4d60-a307-2b078e60cef7\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.716530 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.716503 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:40.845948 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.845864 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc"] Apr 22 18:54:40.848769 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:54:40.848739 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b1318cf_689a_4d60_a307_2b078e60cef7.slice/crio-aec82cab29d6f5eb7579d2bc850077a8691e390fad6f32b58b8dbd385d4c1b5b WatchSource:0}: Error finding container aec82cab29d6f5eb7579d2bc850077a8691e390fad6f32b58b8dbd385d4c1b5b: Status 404 returned error can't find the container with id aec82cab29d6f5eb7579d2bc850077a8691e390fad6f32b58b8dbd385d4c1b5b Apr 22 18:54:40.856887 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:40.856858 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" event={"ID":"2b1318cf-689a-4d60-a307-2b078e60cef7","Type":"ContainerStarted","Data":"aec82cab29d6f5eb7579d2bc850077a8691e390fad6f32b58b8dbd385d4c1b5b"} Apr 22 18:54:41.862143 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:41.862103 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" event={"ID":"2b1318cf-689a-4d60-a307-2b078e60cef7","Type":"ContainerStarted","Data":"1db1228e542703b12400bb0be614791c2ca04f1cd1114cc8056b8b72ac02ae13"} Apr 22 18:54:46.882498 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:46.882462 2570 
generic.go:358] "Generic (PLEG): container finished" podID="2b1318cf-689a-4d60-a307-2b078e60cef7" containerID="1db1228e542703b12400bb0be614791c2ca04f1cd1114cc8056b8b72ac02ae13" exitCode=0 Apr 22 18:54:46.882985 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:46.882542 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" event={"ID":"2b1318cf-689a-4d60-a307-2b078e60cef7","Type":"ContainerDied","Data":"1db1228e542703b12400bb0be614791c2ca04f1cd1114cc8056b8b72ac02ae13"} Apr 22 18:54:47.890633 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:47.890596 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" event={"ID":"2b1318cf-689a-4d60-a307-2b078e60cef7","Type":"ContainerStarted","Data":"4b62861d212e5a53939d7af36a0048c862f2a1ade7bfae3bb2a51b2421e71c74"} Apr 22 18:54:47.891059 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:47.890977 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:54:47.912785 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:47.912731 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" podStartSLOduration=7.714543365 podStartE2EDuration="7.912716994s" podCreationTimestamp="2026-04-22 18:54:40 +0000 UTC" firstStartedPulling="2026-04-22 18:54:46.88330161 +0000 UTC m=+677.866767529" lastFinishedPulling="2026-04-22 18:54:47.081475239 +0000 UTC m=+678.064941158" observedRunningTime="2026-04-22 18:54:47.909895937 +0000 UTC m=+678.893361878" watchObservedRunningTime="2026-04-22 18:54:47.912716994 +0000 UTC m=+678.896182936" Apr 22 18:54:58.906715 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:54:58.906683 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc" Apr 22 18:55:19.954949 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:55:19.954866 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j"] Apr 22 18:55:19.958614 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:19.958596 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:19.961037 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:19.961020 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 22 18:55:19.969813 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:19.969791 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j"] Apr 22 18:55:20.062963 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.062923 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.062963 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.062962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.063193 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.062999 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.063193 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.063116 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.063193 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.063147 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55plz\" (UniqueName: \"kubernetes.io/projected/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-kube-api-access-55plz\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.063322 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.063202 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.164348 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.164295 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.164553 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.164407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.164553 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.164435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55plz\" (UniqueName: \"kubernetes.io/projected/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-kube-api-access-55plz\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.164553 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.164472 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.164553 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.164526 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.164802 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.164555 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.164802 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.164769 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.164917 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.164868 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.165001 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.164978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.167069 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.167044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.167316 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.167298 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.174052 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.174031 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55plz\" (UniqueName: \"kubernetes.io/projected/ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46-kube-api-access-55plz\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j\" (UID: \"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.269350 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.269250 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:20.395071 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:20.395049 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j"] Apr 22 18:55:20.397638 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:55:20.397594 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebdcc4b7_e9ee_4e95_b696_7a7da3aefd46.slice/crio-76be4d8120436e49dee3b1776a61b1760187c92b05c5d5b3af06baeecdb98d00 WatchSource:0}: Error finding container 76be4d8120436e49dee3b1776a61b1760187c92b05c5d5b3af06baeecdb98d00: Status 404 returned error can't find the container with id 76be4d8120436e49dee3b1776a61b1760187c92b05c5d5b3af06baeecdb98d00 Apr 22 18:55:21.015758 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:21.015715 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" event={"ID":"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46","Type":"ContainerStarted","Data":"e9a88172eac2e1e482f6c57c8c0ab4d604a36cd86d1e973515f31cc6aa815d42"} Apr 22 18:55:21.015758 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:21.015763 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" event={"ID":"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46","Type":"ContainerStarted","Data":"76be4d8120436e49dee3b1776a61b1760187c92b05c5d5b3af06baeecdb98d00"} Apr 22 18:55:26.037379 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:26.037342 2570 generic.go:358] "Generic (PLEG): container finished" podID="ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46" containerID="e9a88172eac2e1e482f6c57c8c0ab4d604a36cd86d1e973515f31cc6aa815d42" exitCode=0 Apr 22 18:55:26.037773 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:26.037413 2570 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" event={"ID":"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46","Type":"ContainerDied","Data":"e9a88172eac2e1e482f6c57c8c0ab4d604a36cd86d1e973515f31cc6aa815d42"} Apr 22 18:55:27.042871 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:27.042836 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" event={"ID":"ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46","Type":"ContainerStarted","Data":"7cb3da29b11d76e28960ec413f93ff25dc3bff6229c6eaba2ca5c0ec0d4bf3d4"} Apr 22 18:55:27.043260 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:27.043057 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:27.061553 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:27.061505 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" podStartSLOduration=7.811982227 podStartE2EDuration="8.061490934s" podCreationTimestamp="2026-04-22 18:55:19 +0000 UTC" firstStartedPulling="2026-04-22 18:55:26.038145171 +0000 UTC m=+717.021611091" lastFinishedPulling="2026-04-22 18:55:26.287653872 +0000 UTC m=+717.271119798" observedRunningTime="2026-04-22 18:55:27.060212549 +0000 UTC m=+718.043678492" watchObservedRunningTime="2026-04-22 18:55:27.061490934 +0000 UTC m=+718.044956874" Apr 22 18:55:38.059313 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:38.059282 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j" Apr 22 18:55:47.080547 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.080511 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-695bb55bfc-v2zhk"] Apr 22 18:55:47.083802 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.083780 2570 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-695bb55bfc-v2zhk" Apr 22 18:55:47.090611 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.090582 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-695bb55bfc-v2zhk"] Apr 22 18:55:47.203231 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.203204 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f-tls-cert\") pod \"authorino-695bb55bfc-v2zhk\" (UID: \"67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f\") " pod="kuadrant-system/authorino-695bb55bfc-v2zhk" Apr 22 18:55:47.203333 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.203289 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fdr2\" (UniqueName: \"kubernetes.io/projected/67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f-kube-api-access-2fdr2\") pod \"authorino-695bb55bfc-v2zhk\" (UID: \"67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f\") " pod="kuadrant-system/authorino-695bb55bfc-v2zhk" Apr 22 18:55:47.304570 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.304536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fdr2\" (UniqueName: \"kubernetes.io/projected/67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f-kube-api-access-2fdr2\") pod \"authorino-695bb55bfc-v2zhk\" (UID: \"67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f\") " pod="kuadrant-system/authorino-695bb55bfc-v2zhk" Apr 22 18:55:47.304691 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.304645 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f-tls-cert\") pod \"authorino-695bb55bfc-v2zhk\" (UID: \"67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f\") " pod="kuadrant-system/authorino-695bb55bfc-v2zhk" Apr 22 
18:55:47.307351 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.307327 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f-tls-cert\") pod \"authorino-695bb55bfc-v2zhk\" (UID: \"67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f\") " pod="kuadrant-system/authorino-695bb55bfc-v2zhk" Apr 22 18:55:47.312698 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.312672 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fdr2\" (UniqueName: \"kubernetes.io/projected/67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f-kube-api-access-2fdr2\") pod \"authorino-695bb55bfc-v2zhk\" (UID: \"67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f\") " pod="kuadrant-system/authorino-695bb55bfc-v2zhk" Apr 22 18:55:47.394721 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.394700 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-695bb55bfc-v2zhk" Apr 22 18:55:47.521654 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:47.521608 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-695bb55bfc-v2zhk"] Apr 22 18:55:47.523609 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:55:47.523578 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67c7e850_10f2_4d8d_bd88_1cbf3ca9c86f.slice/crio-dbf7e5a7a49ebafd575c34696b556ed3c7561d7f6003b3321432cacc92388d99 WatchSource:0}: Error finding container dbf7e5a7a49ebafd575c34696b556ed3c7561d7f6003b3321432cacc92388d99: Status 404 returned error can't find the container with id dbf7e5a7a49ebafd575c34696b556ed3c7561d7f6003b3321432cacc92388d99 Apr 22 18:55:48.117494 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.117461 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-695bb55bfc-v2zhk" 
event={"ID":"67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f","Type":"ContainerStarted","Data":"6a39afca65eb5bdaee8a23246a0bc0bd95183c91d591cfb564ae841da9351d8c"} Apr 22 18:55:48.117907 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.117499 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-695bb55bfc-v2zhk" event={"ID":"67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f","Type":"ContainerStarted","Data":"dbf7e5a7a49ebafd575c34696b556ed3c7561d7f6003b3321432cacc92388d99"} Apr 22 18:55:48.133310 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.133266 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-695bb55bfc-v2zhk" podStartSLOduration=0.710975016 podStartE2EDuration="1.133252546s" podCreationTimestamp="2026-04-22 18:55:47 +0000 UTC" firstStartedPulling="2026-04-22 18:55:47.524891144 +0000 UTC m=+738.508357063" lastFinishedPulling="2026-04-22 18:55:47.947168661 +0000 UTC m=+738.930634593" observedRunningTime="2026-04-22 18:55:48.131137636 +0000 UTC m=+739.114603576" watchObservedRunningTime="2026-04-22 18:55:48.133252546 +0000 UTC m=+739.116718529" Apr 22 18:55:48.160210 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.159821 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-85d76994c6-9mntd"] Apr 22 18:55:48.160210 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.160033 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-85d76994c6-9mntd" podUID="566f6de6-0c47-4cd1-994a-cbe6796fc413" containerName="authorino" containerID="cri-o://dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3" gracePeriod=30 Apr 22 18:55:48.416035 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.416006 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:55:48.515219 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.515184 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/566f6de6-0c47-4cd1-994a-cbe6796fc413-tls-cert\") pod \"566f6de6-0c47-4cd1-994a-cbe6796fc413\" (UID: \"566f6de6-0c47-4cd1-994a-cbe6796fc413\") " Apr 22 18:55:48.515219 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.515224 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4rd7\" (UniqueName: \"kubernetes.io/projected/566f6de6-0c47-4cd1-994a-cbe6796fc413-kube-api-access-r4rd7\") pod \"566f6de6-0c47-4cd1-994a-cbe6796fc413\" (UID: \"566f6de6-0c47-4cd1-994a-cbe6796fc413\") " Apr 22 18:55:48.517420 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.517386 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/566f6de6-0c47-4cd1-994a-cbe6796fc413-kube-api-access-r4rd7" (OuterVolumeSpecName: "kube-api-access-r4rd7") pod "566f6de6-0c47-4cd1-994a-cbe6796fc413" (UID: "566f6de6-0c47-4cd1-994a-cbe6796fc413"). InnerVolumeSpecName "kube-api-access-r4rd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:48.526281 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.526249 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/566f6de6-0c47-4cd1-994a-cbe6796fc413-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "566f6de6-0c47-4cd1-994a-cbe6796fc413" (UID: "566f6de6-0c47-4cd1-994a-cbe6796fc413"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:55:48.616789 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.616762 2570 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/566f6de6-0c47-4cd1-994a-cbe6796fc413-tls-cert\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:55:48.616789 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:48.616787 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4rd7\" (UniqueName: \"kubernetes.io/projected/566f6de6-0c47-4cd1-994a-cbe6796fc413-kube-api-access-r4rd7\") on node \"ip-10-0-134-244.ec2.internal\" DevicePath \"\"" Apr 22 18:55:49.122494 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.122457 2570 generic.go:358] "Generic (PLEG): container finished" podID="566f6de6-0c47-4cd1-994a-cbe6796fc413" containerID="dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3" exitCode=0 Apr 22 18:55:49.122939 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.122518 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-85d76994c6-9mntd" Apr 22 18:55:49.122939 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.122543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85d76994c6-9mntd" event={"ID":"566f6de6-0c47-4cd1-994a-cbe6796fc413","Type":"ContainerDied","Data":"dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3"} Apr 22 18:55:49.122939 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.122577 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-85d76994c6-9mntd" event={"ID":"566f6de6-0c47-4cd1-994a-cbe6796fc413","Type":"ContainerDied","Data":"b97192edd192e80a9bf6269d1067c3f17cfb5fb63efd6563ca7b2de584ef703b"} Apr 22 18:55:49.122939 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.122594 2570 scope.go:117] "RemoveContainer" containerID="dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3" Apr 22 18:55:49.131111 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.131094 2570 scope.go:117] "RemoveContainer" containerID="dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3" Apr 22 18:55:49.131341 ip-10-0-134-244 kubenswrapper[2570]: E0422 18:55:49.131322 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3\": container with ID starting with dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3 not found: ID does not exist" containerID="dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3" Apr 22 18:55:49.131407 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.131353 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3"} err="failed to get container status \"dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3\": rpc error: code = 
NotFound desc = could not find container \"dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3\": container with ID starting with dec64259d39ca39af13a8e3759158298a9ff313cd25619c643452af2f8f3d8a3 not found: ID does not exist" Apr 22 18:55:49.145006 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.144984 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-85d76994c6-9mntd"] Apr 22 18:55:49.148845 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.148822 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-85d76994c6-9mntd"] Apr 22 18:55:49.551755 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:55:49.551677 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="566f6de6-0c47-4cd1-994a-cbe6796fc413" path="/var/lib/kubelet/pods/566f6de6-0c47-4cd1-994a-cbe6796fc413/volumes" Apr 22 18:56:59.966154 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:56:59.966070 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-695bb55bfc-v2zhk_67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f/authorino/0.log" Apr 22 18:57:04.357711 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:04.357676 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-74c6cd7d45-wh2fn_1cd1085c-f2fc-48d5-880f-6737130411a7/maas-api/0.log" Apr 22 18:57:04.463597 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:04.463563 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-db5bb48f4-8jbln_a23fa10e-e945-432b-98d5-adf26b100e76/manager/0.log" Apr 22 18:57:04.827996 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:04.827917 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7c59bb5d7b-gkc52_ddedcae0-3939-4ffe-8320-bdfb64ce1341/manager/0.log" Apr 22 18:57:05.038515 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:05.038482 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_postgres-868db5846d-9rsrk_50557419-fb54-49b5-976b-4cbed1aec8d0/postgres/0.log" Apr 22 18:57:06.265911 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:06.265884 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-695bb55bfc-v2zhk_67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f/authorino/0.log" Apr 22 18:57:07.613486 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:07.613456 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6bc9b7f4d-2vxvf_0ddfad48-49ca-4fcd-9ed4-abfe5922044b/kube-auth-proxy/0.log" Apr 22 18:57:07.942869 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:07.942761 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-88bf8fcd4-w7x4n_78f709d1-ef5e-40ac-8845-3f0108fe6b96/router/0.log" Apr 22 18:57:08.376695 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:08.376648 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc_2b1318cf-689a-4d60-a307-2b078e60cef7/storage-initializer/0.log" Apr 22 18:57:08.386718 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:08.386693 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-jpxwc_2b1318cf-689a-4d60-a307-2b078e60cef7/main/0.log" Apr 22 18:57:08.613123 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:08.613091 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j_ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46/storage-initializer/0.log" Apr 22 18:57:08.619955 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:08.619928 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccsdk5j_ebdcc4b7-e9ee-4e95-b696-7a7da3aefd46/main/0.log" Apr 22 18:57:08.843849 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:08.843778 2570 
log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-j92m8_5529bc06-bc50-45fb-bc77-b11f50a74612/main/0.log" Apr 22 18:57:08.850761 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:08.850740 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-j92m8_5529bc06-bc50-45fb-bc77-b11f50a74612/storage-initializer/0.log" Apr 22 18:57:15.806154 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:15.806122 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9v7dk_a88529a8-1055-4ebf-bd16-aa151ce8e4cb/global-pull-secret-syncer/0.log" Apr 22 18:57:15.986320 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:15.986291 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-scf4m_68ae446a-922e-4629-b3f8-e81fa3ec7eec/konnectivity-agent/0.log" Apr 22 18:57:16.029802 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:16.029776 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-244.ec2.internal_009f099669c612b1a9a7e8809b1d3526/haproxy/0.log" Apr 22 18:57:20.246213 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:20.246178 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-695bb55bfc-v2zhk_67c7e850-10f2-4d8d-bd88-1cbf3ca9c86f/authorino/0.log" Apr 22 18:57:21.802557 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:21.802532 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1640a33-1eb6-4132-b617-fe75c229730f/alertmanager/0.log" Apr 22 18:57:21.835854 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:21.835794 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1640a33-1eb6-4132-b617-fe75c229730f/config-reloader/0.log" Apr 22 18:57:21.862033 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:57:21.862014 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1640a33-1eb6-4132-b617-fe75c229730f/kube-rbac-proxy-web/0.log" Apr 22 18:57:21.899898 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:21.899876 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1640a33-1eb6-4132-b617-fe75c229730f/kube-rbac-proxy/0.log" Apr 22 18:57:21.935905 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:21.935889 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1640a33-1eb6-4132-b617-fe75c229730f/kube-rbac-proxy-metric/0.log" Apr 22 18:57:21.963006 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:21.962990 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1640a33-1eb6-4132-b617-fe75c229730f/prom-label-proxy/0.log" Apr 22 18:57:21.995379 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:21.995356 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1640a33-1eb6-4132-b617-fe75c229730f/init-config-reloader/0.log" Apr 22 18:57:22.171401 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.171380 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7d7464b78d-44zkq_95e79537-ff76-4401-abc7-6ccb62d28f5b/metrics-server/0.log" Apr 22 18:57:22.191405 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.191385 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-7fjpj_a92952e7-39fe-4887-b105-21e90a80d306/monitoring-plugin/0.log" Apr 22 18:57:22.389491 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.389461 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xg28t_2669c4ce-5cb0-4d49-9870-a1001889ea0c/node-exporter/0.log" Apr 22 
18:57:22.409975 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.409946 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xg28t_2669c4ce-5cb0-4d49-9870-a1001889ea0c/kube-rbac-proxy/0.log" Apr 22 18:57:22.430294 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.430242 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xg28t_2669c4ce-5cb0-4d49-9870-a1001889ea0c/init-textfile/0.log" Apr 22 18:57:22.543554 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.543537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6be795e0-0dd9-48ab-8279-902c6314c44d/prometheus/0.log" Apr 22 18:57:22.564110 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.564086 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6be795e0-0dd9-48ab-8279-902c6314c44d/config-reloader/0.log" Apr 22 18:57:22.585330 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.585314 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6be795e0-0dd9-48ab-8279-902c6314c44d/thanos-sidecar/0.log" Apr 22 18:57:22.608837 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.608818 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6be795e0-0dd9-48ab-8279-902c6314c44d/kube-rbac-proxy-web/0.log" Apr 22 18:57:22.632061 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.632042 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6be795e0-0dd9-48ab-8279-902c6314c44d/kube-rbac-proxy/0.log" Apr 22 18:57:22.652829 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.652807 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6be795e0-0dd9-48ab-8279-902c6314c44d/kube-rbac-proxy-thanos/0.log" Apr 22 18:57:22.674396 ip-10-0-134-244 
kubenswrapper[2570]: I0422 18:57:22.674374 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6be795e0-0dd9-48ab-8279-902c6314c44d/init-config-reloader/0.log" Apr 22 18:57:22.774729 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.774706 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64fdcd7476-5r6w8_4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af/telemeter-client/0.log" Apr 22 18:57:22.796191 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.796176 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64fdcd7476-5r6w8_4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af/reload/0.log" Apr 22 18:57:22.817413 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.817389 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-64fdcd7476-5r6w8_4d11bfd0-3b3f-43f2-ae0c-bc7b4f8598af/kube-rbac-proxy/0.log" Apr 22 18:57:22.847684 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.847670 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5578c6869-cvzhf_bb72636f-5aad-4c98-b54d-2f66da79f35d/thanos-query/0.log" Apr 22 18:57:22.875140 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.875121 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5578c6869-cvzhf_bb72636f-5aad-4c98-b54d-2f66da79f35d/kube-rbac-proxy-web/0.log" Apr 22 18:57:22.897034 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.897013 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5578c6869-cvzhf_bb72636f-5aad-4c98-b54d-2f66da79f35d/kube-rbac-proxy/0.log" Apr 22 18:57:22.917485 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.917459 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-5578c6869-cvzhf_bb72636f-5aad-4c98-b54d-2f66da79f35d/prom-label-proxy/0.log" Apr 22 18:57:22.945651 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.945609 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5578c6869-cvzhf_bb72636f-5aad-4c98-b54d-2f66da79f35d/kube-rbac-proxy-rules/0.log" Apr 22 18:57:22.966260 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:22.966242 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5578c6869-cvzhf_bb72636f-5aad-4c98-b54d-2f66da79f35d/kube-rbac-proxy-metrics/0.log" Apr 22 18:57:23.994671 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:23.994613 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-62t72_71f3cd65-16e4-4173-821a-48a924d80e7a/networking-console-plugin/0.log" Apr 22 18:57:24.564112 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.564081 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl"] Apr 22 18:57:24.564537 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.564516 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="566f6de6-0c47-4cd1-994a-cbe6796fc413" containerName="authorino" Apr 22 18:57:24.564537 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.564534 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="566f6de6-0c47-4cd1-994a-cbe6796fc413" containerName="authorino" Apr 22 18:57:24.564719 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.564649 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="566f6de6-0c47-4cd1-994a-cbe6796fc413" containerName="authorino" Apr 22 18:57:24.567980 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.567963 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.570253 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.570234 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nb747\"/\"default-dockercfg-hb8wb\"" Apr 22 18:57:24.570348 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.570256 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nb747\"/\"openshift-service-ca.crt\"" Apr 22 18:57:24.571114 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.571099 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nb747\"/\"kube-root-ca.crt\"" Apr 22 18:57:24.575231 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.574861 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl"] Apr 22 18:57:24.604584 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.604565 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-sys\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.604710 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.604595 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-proc\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.604710 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.604698 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-lib-modules\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.604796 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.604740 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-podres\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.604796 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.604783 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz5gp\" (UniqueName: \"kubernetes.io/projected/01b16fb2-af34-4038-9986-3b2a36a4259a-kube-api-access-jz5gp\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.705666 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.705639 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-podres\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.705781 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.705697 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jz5gp\" (UniqueName: \"kubernetes.io/projected/01b16fb2-af34-4038-9986-3b2a36a4259a-kube-api-access-jz5gp\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " 
pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.705781 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.705740 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-sys\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.705781 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.705760 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-proc\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.705937 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.705797 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-lib-modules\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.705937 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.705814 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-sys\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.705937 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.705809 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-podres\") pod 
\"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.705937 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.705862 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-proc\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.705937 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.705904 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/01b16fb2-af34-4038-9986-3b2a36a4259a-lib-modules\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.713038 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.713012 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz5gp\" (UniqueName: \"kubernetes.io/projected/01b16fb2-af34-4038-9986-3b2a36a4259a-kube-api-access-jz5gp\") pod \"perf-node-gather-daemonset-7zxbl\" (UID: \"01b16fb2-af34-4038-9986-3b2a36a4259a\") " pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:24.878828 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:24.878807 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:25.003601 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:25.003576 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl"] Apr 22 18:57:25.005681 ip-10-0-134-244 kubenswrapper[2570]: W0422 18:57:25.005647 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod01b16fb2_af34_4038_9986_3b2a36a4259a.slice/crio-2add0992abb2074c497650971ff90621809a4fe2a0896ad012a5887c6cf72e7c WatchSource:0}: Error finding container 2add0992abb2074c497650971ff90621809a4fe2a0896ad012a5887c6cf72e7c: Status 404 returned error can't find the container with id 2add0992abb2074c497650971ff90621809a4fe2a0896ad012a5887c6cf72e7c Apr 22 18:57:25.472870 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:25.472776 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" event={"ID":"01b16fb2-af34-4038-9986-3b2a36a4259a","Type":"ContainerStarted","Data":"64b2bea833a4a4a6bfb1ef449d9bd672244c1615f16f16fd300c27a5d8f9ba13"} Apr 22 18:57:25.472870 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:25.472814 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" event={"ID":"01b16fb2-af34-4038-9986-3b2a36a4259a","Type":"ContainerStarted","Data":"2add0992abb2074c497650971ff90621809a4fe2a0896ad012a5887c6cf72e7c"} Apr 22 18:57:25.472870 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:25.472849 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:25.490527 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:25.490485 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" 
podStartSLOduration=1.490472633 podStartE2EDuration="1.490472633s" podCreationTimestamp="2026-04-22 18:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:57:25.487740714 +0000 UTC m=+836.471206655" watchObservedRunningTime="2026-04-22 18:57:25.490472633 +0000 UTC m=+836.473938636" Apr 22 18:57:26.532726 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:26.532702 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6zh6w_ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d/dns/0.log" Apr 22 18:57:26.553560 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:26.553537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-6zh6w_ef9eca3e-08aa-4a39-b6ce-a3c3f71dc95d/kube-rbac-proxy/0.log" Apr 22 18:57:26.685740 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:26.685714 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2z78z_4b955592-582e-4878-a4b9-99767a2aaefb/dns-node-resolver/0.log" Apr 22 18:57:27.226562 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:27.226529 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gmml2_d150b818-e3a3-47e2-835c-16ae11dff162/node-ca/0.log" Apr 22 18:57:28.146168 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:28.146141 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6bc9b7f4d-2vxvf_0ddfad48-49ca-4fcd-9ed4-abfe5922044b/kube-auth-proxy/0.log" Apr 22 18:57:28.225098 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:28.225073 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-88bf8fcd4-w7x4n_78f709d1-ef5e-40ac-8845-3f0108fe6b96/router/0.log" Apr 22 18:57:28.751117 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:28.751090 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-ltknd_57a5c4fb-aa29-4923-b848-a57df5a62462/serve-healthcheck-canary/0.log" Apr 22 18:57:29.193364 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:29.193330 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-thx79_0c7ad148-5682-41c7-874a-a20686c43134/insights-operator/0.log" Apr 22 18:57:29.193364 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:29.193366 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-thx79_0c7ad148-5682-41c7-874a-a20686c43134/insights-operator/1.log" Apr 22 18:57:29.288442 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:29.288414 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gvsxk_9a880c07-41b1-4390-9b45-e37ff33a6bbc/kube-rbac-proxy/0.log" Apr 22 18:57:29.309339 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:29.309309 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gvsxk_9a880c07-41b1-4390-9b45-e37ff33a6bbc/exporter/0.log" Apr 22 18:57:29.330017 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:29.329989 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gvsxk_9a880c07-41b1-4390-9b45-e37ff33a6bbc/extractor/0.log" Apr 22 18:57:31.259069 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:31.259034 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-74c6cd7d45-wh2fn_1cd1085c-f2fc-48d5-880f-6737130411a7/maas-api/0.log" Apr 22 18:57:31.283283 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:31.283254 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-db5bb48f4-8jbln_a23fa10e-e945-432b-98d5-adf26b100e76/manager/0.log" Apr 22 18:57:31.386249 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:31.386219 2570 log.go:25] 
"Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7c59bb5d7b-gkc52_ddedcae0-3939-4ffe-8320-bdfb64ce1341/manager/0.log" Apr 22 18:57:31.434883 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:31.434861 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-9rsrk_50557419-fb54-49b5-976b-4cbed1aec8d0/postgres/0.log" Apr 22 18:57:31.486776 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:31.486753 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl" Apr 22 18:57:32.514676 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:32.514644 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-59544c8f7-fq47s_1b5fc6cd-1f41-4ea0-b121-cc5f8c8846ce/manager/0.log" Apr 22 18:57:37.005161 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:37.005134 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-4whhs_c3364bfb-9927-4cd3-89a5-1137295070fd/migrator/0.log" Apr 22 18:57:37.030499 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:37.030465 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-4whhs_c3364bfb-9927-4cd3-89a5-1137295070fd/graceful-termination/0.log" Apr 22 18:57:38.643082 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:38.643043 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftgq4_10fccf91-d12b-4767-94a3-6a751cf19eb8/kube-multus-additional-cni-plugins/0.log" Apr 22 18:57:38.668821 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:38.668793 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftgq4_10fccf91-d12b-4767-94a3-6a751cf19eb8/egress-router-binary-copy/0.log" Apr 22 
18:57:38.690384 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:38.690368 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftgq4_10fccf91-d12b-4767-94a3-6a751cf19eb8/cni-plugins/0.log" Apr 22 18:57:38.711164 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:38.711145 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftgq4_10fccf91-d12b-4767-94a3-6a751cf19eb8/bond-cni-plugin/0.log" Apr 22 18:57:38.732223 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:38.732207 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftgq4_10fccf91-d12b-4767-94a3-6a751cf19eb8/routeoverride-cni/0.log" Apr 22 18:57:38.753086 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:38.753068 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftgq4_10fccf91-d12b-4767-94a3-6a751cf19eb8/whereabouts-cni-bincopy/0.log" Apr 22 18:57:38.776650 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:38.776609 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ftgq4_10fccf91-d12b-4767-94a3-6a751cf19eb8/whereabouts-cni/0.log" Apr 22 18:57:38.958109 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:38.958058 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rhtwv_d24c0ebd-b6b6-4d29-bb7b-8abf194a33f8/kube-multus/0.log" Apr 22 18:57:39.140197 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:39.140167 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-w8q5c_317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282/network-metrics-daemon/0.log" Apr 22 18:57:39.159644 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:39.159598 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-w8q5c_317bb3bc-9a8f-408c-8cc6-ab0ccf5a0282/kube-rbac-proxy/0.log" Apr 22 18:57:40.204186 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:40.204157 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-controller/0.log" Apr 22 18:57:40.224996 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:40.224966 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/0.log" Apr 22 18:57:40.229070 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:40.229053 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovn-acl-logging/1.log" Apr 22 18:57:40.248302 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:40.248285 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/kube-rbac-proxy-node/0.log" Apr 22 18:57:40.271684 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:40.271665 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 18:57:40.290122 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:40.290103 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/northd/0.log" Apr 22 18:57:40.311853 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:40.311837 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/nbdb/0.log" Apr 22 18:57:40.334389 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:40.334342 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/sbdb/0.log" Apr 22 18:57:40.434310 ip-10-0-134-244 kubenswrapper[2570]: I0422 18:57:40.434295 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rpznc_862d4b9d-0093-4dad-8175-851155e4b065/ovnkube-controller/0.log"
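The kubenswrapper lines above all carry the standard klog header: a severity letter (I/W/E/F), an MMDD date, a wall-clock time, the PID, the source file:line, a closing bracket, then a structured message. A minimal sketch of splitting a journal line into those fields; the regex and field names here are my own illustration, not part of kubelet:

```python
import re

# klog header, e.g.: I0422 18:57:25.490485 2570 kubelet.go:2569] "SyncLoop ..."
# Severity is one of I/W/E/F; the structured message follows the closing bracket.
KLOG_RE = re.compile(
    r'(?P<severity>[IWEF])(?P<mmdd>\d{4}) '
    r'(?P<time>\d{2}:\d{2}:\d{2}\.\d+) '
    r'(?P<pid>\d+) '
    r'(?P<source>[\w./-]+:\d+)\] '
    r'(?P<message>.*)'
)

def parse_klog(line):
    """Extract klog header fields from a journal line; None if no header found."""
    m = KLOG_RE.search(line)
    return m.groupdict() if m else None

fields = parse_klog(
    'Apr 22 18:57:25.490527 ip-10-0-134-244 kubenswrapper[2570]: '
    'I0422 18:57:25.490485 2570 pod_startup_latency_tracker.go:104] '
    '"Observed pod startup duration" pod="openshift-must-gather-nb747/perf-node-gather-daemonset-7zxbl"'
)
# fields["severity"] == "I", fields["source"] == "pod_startup_latency_tracker.go:104"
```

Note that `search` (not `match`) is used so the journald prefix (`Apr 22 ... kubenswrapper[2570]:`) is skipped; none of its tokens can satisfy the `[IWEF]\d{4} ` anchor.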
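As a quick sanity check on the pod_startup_latency_tracker entry above: the reported podStartSLOduration=1.490472633 is the gap between podCreationTimestamp (2026-04-22 18:57:24 +0000 UTC) and watchObservedRunningTime (18:57:25.490472633). Reproducing that arithmetic, with the nanosecond timestamps truncated to the microseconds Python's datetime carries:

```python
from datetime import datetime, timezone

# Timestamps as printed in the tracker entry (truncated to microseconds).
created = datetime(2026, 4, 22, 18, 57, 24, 0, tzinfo=timezone.utc)
watch_observed = datetime(2026, 4, 22, 18, 57, 25, 490472, tzinfo=timezone.utc)

# Matches the logged podStartSLOduration to microsecond precision.
slo_duration = (watch_observed - created).total_seconds()
```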