Apr 22 21:06:37.663390 ip-10-0-133-75 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 21:06:37.663401 ip-10-0-133-75 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 21:06:37.663410 ip-10-0-133-75 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 21:06:37.663610 ip-10-0-133-75 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 21:06:47.696619 ip-10-0-133-75 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 21:06:47.696634 ip-10-0-133-75 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 54ee0356fe7747f4bd1d2455b1dff778 --
Apr 22 21:09:12.013447 ip-10-0-133-75 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 21:09:12.504205 ip-10-0-133-75 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:09:12.504205 ip-10-0-133-75 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 21:09:12.504205 ip-10-0-133-75 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:09:12.504205 ip-10-0-133-75 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 21:09:12.504205 ip-10-0-133-75 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:09:12.505905 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.505819 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 21:09:12.512378 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512362 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:12.512378 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512376 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512380 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512384 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512387 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512390 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512393 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512395 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512398 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512401 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512404 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512407 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512409 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512412 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512414 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512417 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512420 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512423 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512425 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512428 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:12.512443 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512430 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512433 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512438 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512443 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512447 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512450 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512453 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512457 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512460 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512462 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512465 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512467 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512470 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512473 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512475 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512478 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512482 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512485 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512488 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:12.512937 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512490 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512493 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512496 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512499 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512501 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512504 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512507 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512510 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512513 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512517 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512521 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512525 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512528 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512530 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512533 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512536 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512539 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512542 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512544 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512547 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:12.513414 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512549 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512552 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512555 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512557 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512560 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512564 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512566 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512568 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512571 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512574 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512576 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512579 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512581 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512584 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512586 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512589 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512592 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512594 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512597 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512599 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:12.513897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512602 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512605 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512607 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512610 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512612 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512615 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512617 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512993 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.512998 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513001 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513004 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513006 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513009 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513012 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513015 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513017 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513019 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513022 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513024 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513027 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:12.514396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513030 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513032 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513035 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513037 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513040 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513043 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513045 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513048 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513051 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513055 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513058 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513062 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513064 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513067 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513069 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513072 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513074 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513077 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513080 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513084 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:12.514874 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513087 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513090 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513092 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513095 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513097 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513100 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513103 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513105 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513107 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513110 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513113 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513115 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513118 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513120 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513122 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513125 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513127 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513130 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513132 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513135 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:12.515410 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513137 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513157 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513160 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513163 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513165 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513168 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513170 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513174 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513177 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513179 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513181 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513186 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513189 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513192 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513194 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513197 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513199 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513202 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513205 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513207 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:12.515897 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513210 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513212 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513215 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513218 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513220 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513223 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513225 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513229 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513233 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513235 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513238 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513240 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.513243 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515270 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515285 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515292 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515296 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515301 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515304 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515309 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515313 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 21:09:12.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515316 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515319 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515323 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515326 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515329 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515332 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515335 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515338 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515341 2569 flags.go:64] FLAG: --cloud-config=""
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515344 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515347 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515352 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515355 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515358 2569 flags.go:64] FLAG: --config-dir=""
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515360 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515364 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515367 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515370 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515374 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515377 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515380 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515383 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515386 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515389 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515392 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 21:09:12.516914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515396 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515400 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515403 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515406 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515409 2569 flags.go:64] FLAG: --enable-server="true"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515412 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515417 2569 flags.go:64] FLAG: --event-burst="100"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515420 2569 flags.go:64] FLAG: --event-qps="50"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515423 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515427 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515430 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515434 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515437 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515440 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515443 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515445 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515448 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515451 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515454 2569 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515457 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515460 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515463 2569 flags.go:64] FLAG: --feature-gates="" Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515466 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515469 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515473 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 21:09:12.517533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515476 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515479 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515481 2569 flags.go:64] FLAG: --help="false" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515484 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-133-75.ec2.internal" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515487 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515491 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515494 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515497 2569 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515501 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515504 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515507 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515510 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515513 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515516 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515519 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515522 2569 flags.go:64] FLAG: --kube-reserved="" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515525 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515528 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515531 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515534 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515537 2569 flags.go:64] FLAG: --lock-file="" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515540 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" 
Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515542 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515545 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 21:09:12.518126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515551 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515554 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515556 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515559 2569 flags.go:64] FLAG: --logging-format="text" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515562 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515565 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515568 2569 flags.go:64] FLAG: --manifest-url="" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515571 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515575 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515578 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515583 2569 flags.go:64] FLAG: --max-pods="110" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515585 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515588 2569 flags.go:64] 
FLAG: --maximum-dead-containers-per-container="1" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515591 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515594 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515597 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515600 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515603 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515611 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515614 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515617 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515620 2569 flags.go:64] FLAG: --pod-cidr="" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515623 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 21:09:12.518710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515629 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515632 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515635 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:09:12.515638 2569 flags.go:64] FLAG: --port="10250" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515641 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515644 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0def9175cea309222" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515647 2569 flags.go:64] FLAG: --qos-reserved="" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515649 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515652 2569 flags.go:64] FLAG: --register-node="true" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515655 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515658 2569 flags.go:64] FLAG: --register-with-taints="" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515662 2569 flags.go:64] FLAG: --registry-burst="10" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515664 2569 flags.go:64] FLAG: --registry-qps="5" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515667 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515670 2569 flags.go:64] FLAG: --reserved-memory="" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515673 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515676 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515679 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515682 2569 
flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515685 2569 flags.go:64] FLAG: --runonce="false" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515688 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515691 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515694 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515697 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515700 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515703 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 21:09:12.519278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515705 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515708 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515711 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515715 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515717 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515720 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515723 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 
21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515726 2569 flags.go:64] FLAG: --system-cgroups="" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515729 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515734 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515737 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515740 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515744 2569 flags.go:64] FLAG: --tls-min-version="" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515747 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515749 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515752 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515755 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515758 2569 flags.go:64] FLAG: --v="2" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515762 2569 flags.go:64] FLAG: --version="false" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515766 2569 flags.go:64] FLAG: --vmodule="" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515771 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.515774 2569 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515868 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515872 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515875 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 21:09:12.519959 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515878 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515881 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515884 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515886 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515889 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515892 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515894 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515897 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515900 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515903 2569 feature_gate.go:328] 
unrecognized feature gate: AzureMultiDisk Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515906 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515909 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515911 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515914 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515917 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515919 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515922 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515925 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515927 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515930 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 21:09:12.520585 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515933 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515935 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 21:09:12.521130 ip-10-0-133-75 
kubenswrapper[2569]: W0422 21:09:12.515938 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515940 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515943 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515945 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515948 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515951 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515953 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515956 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515958 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515961 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515964 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515966 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515969 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:09:12.521130 ip-10-0-133-75 
kubenswrapper[2569]: W0422 21:09:12.515972 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515974 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515977 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515979 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515981 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 21:09:12.521130 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515984 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515987 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515990 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515992 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515995 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.515997 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516000 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516003 2569 feature_gate.go:328] 
unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516007 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516010 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516013 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516016 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516019 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516022 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516024 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516027 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516029 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516032 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516034 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 21:09:12.521633 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516037 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 
22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516039 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516042 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516047 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516050 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516053 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516055 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516058 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516060 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516063 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516065 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516068 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516071 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516075 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516080 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516082 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516085 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516088 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516091 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516093 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 21:09:12.522098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516096 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 21:09:12.522974 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516098 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 21:09:12.522974 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516101 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 21:09:12.522974 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.516103 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 21:09:12.522974 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.517095 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false 
MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 21:09:12.523852 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.523830 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 21:09:12.523852 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.523852 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523924 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523932 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523938 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523942 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523947 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523951 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523956 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523960 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523965 2569 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523970 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523974 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523980 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523984 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523988 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523993 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.523997 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524001 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524006 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 21:09:12.523999 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524010 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524015 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524019 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 21:09:12.524834 
ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524024 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524028 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524033 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524038 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524042 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524046 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524050 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524055 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524059 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524063 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524069 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524074 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524078 2569 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524082 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524086 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524090 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524094 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 21:09:12.524834 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524098 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524102 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524106 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524111 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524115 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524120 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524124 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524128 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: 
W0422 21:09:12.524132 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524157 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524162 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524166 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524171 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524175 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524179 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524184 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524191 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524197 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524202 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524207 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 21:09:12.525461 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524211 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524215 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524220 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524224 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524228 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524232 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524242 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524246 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524250 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524255 2569 feature_gate.go:328] unrecognized feature 
gate: IngressControllerLBSubnetsAWS Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524259 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524263 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524267 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524271 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524276 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524281 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524286 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524290 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524294 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524298 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 21:09:12.526023 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524304 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524311 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524316 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524320 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524324 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524328 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524333 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524338 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.524346 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524549 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524558 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 
22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524563 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524567 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524571 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524576 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524580 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 21:09:12.526978 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524584 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524588 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524593 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524598 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524603 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524607 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524612 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524616 2569 feature_gate.go:328] unrecognized feature gate: 
VolumeGroupSnapshot Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524620 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524624 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524629 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524633 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524637 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524642 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524646 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524650 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524654 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524658 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524662 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524666 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 21:09:12.527685 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524671 2569 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524676 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524683 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524689 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524694 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524698 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524703 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524707 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524711 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524715 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524719 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524724 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524728 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 
21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524733 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524737 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524741 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524746 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524751 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524755 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524760 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 21:09:12.528329 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524764 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524768 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524772 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524776 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524781 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524785 2569 feature_gate.go:328] unrecognized 
feature gate: DyanmicServiceEndpointIBMCloud Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524789 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524793 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524796 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524801 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524805 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524809 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524813 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524818 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524823 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524827 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524831 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524835 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 21:09:12.528921 
ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524839 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 21:09:12.528921 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524843 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524847 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524851 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524855 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524859 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524863 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524867 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524872 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524876 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524880 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524885 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524890 2569 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524894 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524898 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524902 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524907 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524913 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524918 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524923 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 21:09:12.529431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:12.524927 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 21:09:12.529917 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.524934 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 21:09:12.529917 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:09:12.525730 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 21:09:12.529917 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.528358 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 21:09:12.529917 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.529419 2569 server.go:1019] "Starting client certificate rotation" Apr 22 21:09:12.529917 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.529513 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 21:09:12.529917 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.529553 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 21:09:12.562390 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.562369 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 21:09:12.565745 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.565725 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 21:09:12.580709 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.580694 2569 log.go:25] "Validated CRI v1 runtime API" Apr 22 21:09:12.586485 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.586471 2569 log.go:25] "Validated CRI v1 image API" Apr 22 21:09:12.587707 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.587691 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 21:09:12.590582 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.590565 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 21:09:12.593197 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:09:12.593176 2569 fs.go:135] Filesystem UUIDs: map[3340cd5c-52d8-41d0-96eb-883dc92bef95:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e747a13a-bc4f-4806-b9c2-71c5bd1c64aa:/dev/nvme0n1p3]
Apr 22 21:09:12.593268 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.593195 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 21:09:12.599213 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.599077 2569 manager.go:217] Machine: {Timestamp:2026-04-22 21:09:12.596773464 +0000 UTC m=+0.469665308 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100011 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a9f93de305e0066ba89cac18dd60d SystemUUID:ec2a9f93-de30-5e00-66ba-89cac18dd60d BootID:54ee0356-fe77-47f4-bd1d-2455b1dff778 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:06:b9:a3:fc:9f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:06:b9:a3:fc:9f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:24:bf:5e:5f:6a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 21:09:12.599213 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.599203 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 21:09:12.599378 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.599347 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 21:09:12.600470 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.600444 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 21:09:12.600620 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.600471 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-75.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 21:09:12.600744 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.600633 2569 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 21:09:12.600744 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.600645 2569 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 21:09:12.600744 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.600663 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 21:09:12.601588 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.601575 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 21:09:12.602863 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.602851 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 21:09:12.602997 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.602985 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 21:09:12.605519 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.605508 2569 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 21:09:12.605586 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.605527 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 21:09:12.605586 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.605543 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 21:09:12.605586 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.605560 2569 kubelet.go:397] "Adding apiserver pod source"
Apr 22 21:09:12.605586 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.605573 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 21:09:12.606748 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.606735 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 21:09:12.606825 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.606757 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 21:09:12.608750 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.608735 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w2r99"
Apr 22 21:09:12.610412 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.610398 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 21:09:12.611861 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.611846 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 21:09:12.614104 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614091 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 21:09:12.614200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614110 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 21:09:12.614200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614119 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 21:09:12.614200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614127 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 21:09:12.614200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614135 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 21:09:12.614200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614156 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 21:09:12.614200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614165 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 21:09:12.614200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614173 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 21:09:12.614200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614185 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 21:09:12.614200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614195 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 21:09:12.614478 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614208 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 21:09:12.614478 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614222 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 21:09:12.614693 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.614676 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w2r99"
Apr 22 21:09:12.616186 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.616174 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 21:09:12.616238 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.616189 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 21:09:12.619677 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.619662 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 21:09:12.619763 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.619698 2569 server.go:1295] "Started kubelet"
Apr 22 21:09:12.619825 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.619763 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 21:09:12.619885 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.619843 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 21:09:12.619925 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.619903 2569 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 21:09:12.620433 ip-10-0-133-75 systemd[1]: Started Kubernetes Kubelet.
Apr 22 21:09:12.623610 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.623584 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:12.623685 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.623609 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 21:09:12.625591 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.625575 2569 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 21:09:12.627833 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.627815 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:12.629670 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.629655 2569 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-75.ec2.internal" not found
Apr 22 21:09:12.629823 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.629805 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 21:09:12.629823 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.629816 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 21:09:12.630445 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.630428 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 21:09:12.630445 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.630435 2569 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 21:09:12.630571 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.630456 2569 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 21:09:12.630571 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.630514 2569 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 21:09:12.630571 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.630525 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 21:09:12.630821 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:12.630783 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-75.ec2.internal\" not found"
Apr 22 21:09:12.631853 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.631712 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:12.633308 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.633067 2569 factory.go:55] Registering systemd factory
Apr 22 21:09:12.633308 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.633124 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 22 21:09:12.633693 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.633656 2569 factory.go:153] Registering CRI-O factory
Apr 22 21:09:12.633693 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.633669 2569 factory.go:223] Registration of the crio container factory successfully
Apr 22 21:09:12.633909 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.633721 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 21:09:12.633909 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.633743 2569 factory.go:103] Registering Raw factory
Apr 22 21:09:12.633909 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.633756 2569 manager.go:1196] Started watching for new ooms in manager
Apr 22 21:09:12.634569 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.634114 2569 manager.go:319] Starting recovery of all containers
Apr 22 21:09:12.634569 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:12.634241 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 21:09:12.634569 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:12.634424 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-75.ec2.internal\" not found" node="ip-10-0-133-75.ec2.internal"
Apr 22 21:09:12.643501 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.643486 2569 manager.go:324] Recovery completed
Apr 22 21:09:12.644751 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:12.644736 2569 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 22 21:09:12.645593 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.645578 2569 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-75.ec2.internal" not found
Apr 22 21:09:12.647430 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.647418 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 21:09:12.649917 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.649898 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 21:09:12.649985 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.649920 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 21:09:12.649985 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.649931 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientPID"
Apr 22 21:09:12.650443 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.650428 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 21:09:12.650443 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.650441 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 21:09:12.650538 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.650457 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 21:09:12.653034 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.653020 2569 policy_none.go:49] "None policy: Start"
Apr 22 21:09:12.653034 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.653035 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 21:09:12.653219 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.653044 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 21:09:12.689658 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.689645 2569 manager.go:341] "Starting Device Plugin manager"
Apr 22 21:09:12.702049 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:12.689671 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 21:09:12.702049 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.689679 2569 server.go:85] "Starting device plugin registration server"
Apr 22 21:09:12.702049 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.689847 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 21:09:12.702049 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.689855 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 21:09:12.702049 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.689917 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 21:09:12.702049 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.690000 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 21:09:12.702049 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.690007 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 21:09:12.702049 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:12.690476 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 21:09:12.702049 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:12.690512 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-75.ec2.internal\" not found"
Apr 22 21:09:12.705849 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.705835 2569 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-75.ec2.internal" not found
Apr 22 21:09:12.783095 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.783048 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 21:09:12.784246 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.784226 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 21:09:12.784330 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.784253 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 21:09:12.784330 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.784268 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 21:09:12.784330 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.784274 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 21:09:12.784461 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:12.784332 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 21:09:12.787286 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.787270 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:12.789980 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.789964 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 21:09:12.790707 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.790692 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 21:09:12.790780 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.790718 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 21:09:12.790780 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.790729 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeHasSufficientPID"
Apr 22 21:09:12.790780 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.790749 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-75.ec2.internal"
Apr 22 21:09:12.799732 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.799717 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-75.ec2.internal"
Apr 22 21:09:12.884895 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.884877 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal"]
Apr 22 21:09:12.887501 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.887486 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:12.887582 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.887491 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:12.912803 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.912788 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:12.916089 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.916076 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:12.925180 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.925164 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 21:09:12.931745 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.931730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:12.931809 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.931754 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5401552a10b9bd31fa1f4a18dcace9bb-config\") pod \"kube-apiserver-proxy-ip-10-0-133-75.ec2.internal\" (UID: \"5401552a10b9bd31fa1f4a18dcace9bb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:12.931809 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:12.931772 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:13.013902 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.013882 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 21:09:13.032820 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.032799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:13.032898 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.032832 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:13.032898 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.032870 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5401552a10b9bd31fa1f4a18dcace9bb-config\") pod \"kube-apiserver-proxy-ip-10-0-133-75.ec2.internal\" (UID: \"5401552a10b9bd31fa1f4a18dcace9bb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:13.032978 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.032907 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5401552a10b9bd31fa1f4a18dcace9bb-config\") pod \"kube-apiserver-proxy-ip-10-0-133-75.ec2.internal\" (UID: \"5401552a10b9bd31fa1f4a18dcace9bb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:13.032978 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.032948 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:13.033046 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.032980 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e52f89589d514c455852c1cdd49a71bd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal\" (UID: \"e52f89589d514c455852c1cdd49a71bd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:13.227351 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.227277 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:13.317159 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.316990 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal"
Apr 22 21:09:13.529525 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.529463 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 21:09:13.530127 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.529573 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 21:09:13.530127 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.529601 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 21:09:13.530127 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.529609 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 21:09:13.606229 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.606207 2569 apiserver.go:52] "Watching apiserver"
Apr 22 21:09:13.613575 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.613557 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 21:09:13.613935 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.613915 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-b4l88","kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd","openshift-image-registry/node-ca-q864p","openshift-multus/multus-additional-cni-plugins-6z85g","openshift-multus/network-metrics-daemon-d7j8j","openshift-cluster-node-tuning-operator/tuned-ccslk","openshift-dns/node-resolver-rllmv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal","openshift-multus/multus-xv5js","openshift-network-diagnostics/network-check-target-b9rbt","openshift-network-operator/iptables-alerter-r4xxs","openshift-ovn-kubernetes/ovnkube-node-sshlp"]
Apr 22 21:09:13.615127 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.615109 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xv5js"
Apr 22 21:09:13.616215 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.616193 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 21:04:12 +0000 UTC" deadline="2028-01-07 17:29:30.082771938 +0000 UTC"
Apr 22 21:09:13.616215 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.616211 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14996h20m16.466562379s"
Apr 22 21:09:13.616326 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.616238 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6z85g"
Apr 22 21:09:13.617256 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.617236 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 21:09:13.617352 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.617272 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 21:09:13.617352 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.617236 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 21:09:13.617857 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.617807 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j"
Apr 22 21:09:13.617953 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:13.617925 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a"
Apr 22 21:09:13.618029 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.618010 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 21:09:13.618316 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.618298 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tkx2v\""
Apr 22 21:09:13.618396 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.618337 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 21:09:13.618516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.618500 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 21:09:13.618588 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.618572 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m42kf\""
Apr 22 21:09:13.619098 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.618860 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ccslk"
Apr 22 21:09:13.620015 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.619983 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rllmv"
Apr 22 21:09:13.620848 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.620831 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 21:09:13.620969 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.620954 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pzc7r\""
Apr 22 21:09:13.621210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.621190 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 21:09:13.621349 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.621335 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-b4l88"
Apr 22 21:09:13.622037 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.622022 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 21:09:13.622326 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.622310 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt"
Apr 22 21:09:13.622416 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.622361 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-q7c24\""
Apr 22 21:09:13.622416 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:13.622362 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460"
Apr 22 21:09:13.622573 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.622559 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 21:09:13.623250 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.623232 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 21:09:13.623324 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.623277 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 21:09:13.623376 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.623338 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd"
Apr 22 21:09:13.623376 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.623282 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-kzp5w\""
Apr 22 21:09:13.624307 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.624292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q864p"
Apr 22 21:09:13.624996 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.624980 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 21:09:13.625269 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.625248 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r4xxs"
Apr 22 21:09:13.625362 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.625273 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 21:09:13.625362 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.625289 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-q7rq8\""
Apr 22 21:09:13.625362 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.625325 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 21:09:13.626199 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.626183 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 21:09:13.626259 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.626246 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 21:09:13.626314 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.626249 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 21:09:13.626497 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.626481 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.626856 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.626841 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fndn8\"" Apr 22 21:09:13.627289 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.627269 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 21:09:13.627488 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.627472 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 21:09:13.627488 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.627484 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 21:09:13.627607 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.627484 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dj76g\"" Apr 22 21:09:13.628187 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.628171 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 21:09:13.629465 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.629439 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 21:09:13.629465 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.629458 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 21:09:13.629649 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.629486 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pm9kf\"" Apr 22 21:09:13.629649 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.629449 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 21:09:13.629649 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.629523 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 21:09:13.629649 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.629439 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 21:09:13.629914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.629897 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 21:09:13.631170 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.631136 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 21:09:13.637540 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637522 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d1d59cdd-035f-4424-9def-015beb3b369f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.637606 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637548 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfjw6\" (UniqueName: \"kubernetes.io/projected/d1d59cdd-035f-4424-9def-015beb3b369f-kube-api-access-dfjw6\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " 
pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.637606 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637585 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-run-netns\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.637671 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637607 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/62da3121-b9c0-42d1-b441-45c1a4816f11-ovnkube-config\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.637671 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637637 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-sys-fs\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.637671 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637666 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-os-release\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.637760 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637693 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-run-multus-certs\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.637760 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637709 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-sysctl-conf\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.637760 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637722 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-run\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.637760 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637737 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0-iptables-alerter-script\") pod \"iptables-alerter-r4xxs\" (UID: \"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0\") " pod="openshift-network-operator/iptables-alerter-r4xxs" Apr 22 21:09:13.637880 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637760 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-cnibin\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.637880 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637786 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d1d59cdd-035f-4424-9def-015beb3b369f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.637880 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq4s2\" (UniqueName: \"kubernetes.io/projected/e62d12e3-203c-47e6-8292-aebabba9b716-kube-api-access-vq4s2\") pod \"node-resolver-rllmv\" (UID: \"e62d12e3-203c-47e6-8292-aebabba9b716\") " pod="openshift-dns/node-resolver-rllmv" Apr 22 21:09:13.637880 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637815 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-kubelet\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.637880 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637835 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-cni-bin\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.637880 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637860 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e62d12e3-203c-47e6-8292-aebabba9b716-hosts-file\") pod \"node-resolver-rllmv\" (UID: \"e62d12e3-203c-47e6-8292-aebabba9b716\") " 
pod="openshift-dns/node-resolver-rllmv" Apr 22 21:09:13.638084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637886 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-cni-netd\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.638084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637914 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdq4\" (UniqueName: \"kubernetes.io/projected/62da3121-b9c0-42d1-b441-45c1a4816f11-kube-api-access-qjdq4\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.638084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637956 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-var-lib-kubelet\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.638084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.637989 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-tuned\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.638084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638011 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a350647b-d99d-4b2d-b6df-a46ddc7da504-tmp\") 
pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.638084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638037 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-run-openvswitch\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.638084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638060 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-system-cni-dir\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.638084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638082 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-daemon-config\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638105 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0031b709-1393-4369-bbf3-cc631f87aafc-agent-certs\") pod \"konnectivity-agent-b4l88\" (UID: \"0031b709-1393-4369-bbf3-cc631f87aafc\") " pod="kube-system/konnectivity-agent-b4l88" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638135 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/0031b709-1393-4369-bbf3-cc631f87aafc-konnectivity-ca\") pod \"konnectivity-agent-b4l88\" (UID: \"0031b709-1393-4369-bbf3-cc631f87aafc\") " pod="kube-system/konnectivity-agent-b4l88" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d1d59cdd-035f-4424-9def-015beb3b369f-cni-binary-copy\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638242 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-systemd-units\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638277 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/62da3121-b9c0-42d1-b441-45c1a4816f11-ovnkube-script-lib\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638306 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-kubernetes\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:09:13.638332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjrld\" (UniqueName: \"kubernetes.io/projected/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-kube-api-access-wjrld\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638347 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0-host-slash\") pod \"iptables-alerter-r4xxs\" (UID: \"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0\") " pod="openshift-network-operator/iptables-alerter-r4xxs" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638373 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-os-release\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.638413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638398 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5e2e8788-770b-4c8b-aa3b-d52af912e57b-serviceca\") pod \"node-ca-q864p\" (UID: \"5e2e8788-770b-4c8b-aa3b-d52af912e57b\") " pod="openshift-image-registry/node-ca-q864p" Apr 22 21:09:13.638792 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-node-log\") pod \"ovnkube-node-sshlp\" (UID: 
\"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.638829 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638808 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-socket-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.638863 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638832 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-device-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.638863 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638853 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-var-lib-cni-bin\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.638939 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638870 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-sys\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.638939 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638884 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-systemd\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.638939 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638908 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcrjn\" (UniqueName: \"kubernetes.io/projected/a350647b-d99d-4b2d-b6df-a46ddc7da504-kube-api-access-kcrjn\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.638939 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638929 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-sysconfig\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.639055 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638945 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-lib-modules\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.639055 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.638984 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.639055 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639010 
2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e62d12e3-203c-47e6-8292-aebabba9b716-tmp-dir\") pod \"node-resolver-rllmv\" (UID: \"e62d12e3-203c-47e6-8292-aebabba9b716\") " pod="openshift-dns/node-resolver-rllmv" Apr 22 21:09:13.639055 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639038 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-run-ovn-kubernetes\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.639182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639065 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.639182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639081 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-cni-dir\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639096 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-cnibin\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " 
pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639110 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-etc-selinux\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.639182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9420aad-9147-4086-9dbb-2f74a2f65676-cni-binary-copy\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639139 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-socket-dir-parent\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639179 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-conf-dir\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639373 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639199 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-etc-kubernetes\") 
pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639373 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639232 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp87c\" (UniqueName: \"kubernetes.io/projected/d9420aad-9147-4086-9dbb-2f74a2f65676-kube-api-access-vp87c\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639373 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639262 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-registration-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.639373 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639289 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-system-cni-dir\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.639373 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639314 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.639373 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639332 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/62da3121-b9c0-42d1-b441-45c1a4816f11-env-overrides\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.639373 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639345 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-modprobe-d\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.639373 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639367 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639397 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-etc-openvswitch\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639418 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-slash\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639432 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-var-lib-openvswitch\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639452 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-var-lib-cni-multus\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639478 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlcz8\" (UniqueName: \"kubernetes.io/projected/0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0-kube-api-access-wlcz8\") pod \"iptables-alerter-r4xxs\" (UID: \"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0\") " pod="openshift-network-operator/iptables-alerter-r4xxs" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639494 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rpwg\" (UniqueName: \"kubernetes.io/projected/5e2e8788-770b-4c8b-aa3b-d52af912e57b-kube-api-access-6rpwg\") pod \"node-ca-q864p\" (UID: \"5e2e8788-770b-4c8b-aa3b-d52af912e57b\") " pod="openshift-image-registry/node-ca-q864p" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639508 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-run-systemd\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639534 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-run-k8s-cni-cncf-io\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639547 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-run-netns\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639561 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-var-lib-kubelet\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639574 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-hostroot\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639601 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-sysctl-d\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639614 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-host\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.639626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639626 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e2e8788-770b-4c8b-aa3b-d52af912e57b-host\") pod \"node-ca-q864p\" (UID: \"5e2e8788-770b-4c8b-aa3b-d52af912e57b\") " pod="openshift-image-registry/node-ca-q864p" Apr 22 21:09:13.640058 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639638 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-run-ovn\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.640058 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639662 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-log-socket\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.640058 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639676 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/62da3121-b9c0-42d1-b441-45c1a4816f11-ovn-node-metrics-cert\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.640058 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.639690 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpf66\" (UniqueName: \"kubernetes.io/projected/80bac7af-2767-4aee-b3fa-d0683f389b6a-kube-api-access-vpf66\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:13.652692 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.652674 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 21:09:13.683045 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.683027 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fk29j" Apr 22 21:09:13.690372 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.690353 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fk29j" Apr 22 21:09:13.740114 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740091 2569 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-run-systemd\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740199 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740122 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-run-k8s-cni-cncf-io\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.740199 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740154 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-run-netns\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.740199 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740171 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-var-lib-kubelet\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.740199 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740186 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-hostroot\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740201 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-run-k8s-cni-cncf-io\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740218 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-sysctl-d\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-var-lib-kubelet\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740200 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-run-netns\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-run-systemd\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740264 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-hostroot\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740258 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-host\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e2e8788-770b-4c8b-aa3b-d52af912e57b-host\") pod \"node-ca-q864p\" (UID: \"5e2e8788-770b-4c8b-aa3b-d52af912e57b\") " pod="openshift-image-registry/node-ca-q864p" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-sysctl-d\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740302 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-host\") pod \"tuned-ccslk\" (UID: 
\"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.740332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-run-ovn\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740337 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-log-socket\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740344 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e2e8788-770b-4c8b-aa3b-d52af912e57b-host\") pod \"node-ca-q864p\" (UID: \"5e2e8788-770b-4c8b-aa3b-d52af912e57b\") " pod="openshift-image-registry/node-ca-q864p" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740353 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/62da3121-b9c0-42d1-b441-45c1a4816f11-ovn-node-metrics-cert\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:13.740357 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740368 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-run-ovn\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740374 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpf66\" (UniqueName: \"kubernetes.io/projected/80bac7af-2767-4aee-b3fa-d0683f389b6a-kube-api-access-vpf66\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740397 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-log-socket\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740401 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d1d59cdd-035f-4424-9def-015beb3b369f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:13.740427 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs podName:80bac7af-2767-4aee-b3fa-d0683f389b6a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.240396586 +0000 UTC m=+2.113288421 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs") pod "network-metrics-daemon-d7j8j" (UID: "80bac7af-2767-4aee-b3fa-d0683f389b6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740525 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfjw6\" (UniqueName: \"kubernetes.io/projected/d1d59cdd-035f-4424-9def-015beb3b369f-kube-api-access-dfjw6\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740563 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-run-netns\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740599 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/62da3121-b9c0-42d1-b441-45c1a4816f11-ovnkube-config\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740666 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-run-netns\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:09:13.740690 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-sys-fs\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740728 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-os-release\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.740735 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740737 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740754 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-run-multus-certs\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740752 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-sys-fs\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740784 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-sysctl-conf\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-os-release\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-run\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740807 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-run-multus-certs\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740839 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0-iptables-alerter-script\") pod \"iptables-alerter-r4xxs\" (UID: \"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0\") " pod="openshift-network-operator/iptables-alerter-r4xxs" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-cnibin\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740878 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-run\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740893 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d1d59cdd-035f-4424-9def-015beb3b369f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740900 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-sysctl-conf\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d1d59cdd-035f-4424-9def-015beb3b369f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740928 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-cnibin\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq4s2\" (UniqueName: \"kubernetes.io/projected/e62d12e3-203c-47e6-8292-aebabba9b716-kube-api-access-vq4s2\") pod \"node-resolver-rllmv\" (UID: \"e62d12e3-203c-47e6-8292-aebabba9b716\") " pod="openshift-dns/node-resolver-rllmv" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740961 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-kubelet\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-cni-bin\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.741414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.740994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e62d12e3-203c-47e6-8292-aebabba9b716-hosts-file\") pod \"node-resolver-rllmv\" (UID: \"e62d12e3-203c-47e6-8292-aebabba9b716\") " pod="openshift-dns/node-resolver-rllmv" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741012 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-cni-netd\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741067 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-cni-netd\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741090 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-kubelet\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdq4\" (UniqueName: \"kubernetes.io/projected/62da3121-b9c0-42d1-b441-45c1a4816f11-kube-api-access-qjdq4\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741131 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-var-lib-kubelet\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741161 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-tuned\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a350647b-d99d-4b2d-b6df-a46ddc7da504-tmp\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-run-openvswitch\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741214 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/62da3121-b9c0-42d1-b441-45c1a4816f11-ovnkube-config\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-system-cni-dir\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741246 2569 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-system-cni-dir\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741269 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-cni-bin\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741270 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-daemon-config\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0031b709-1393-4369-bbf3-cc631f87aafc-agent-certs\") pod \"konnectivity-agent-b4l88\" (UID: \"0031b709-1393-4369-bbf3-cc631f87aafc\") " pod="kube-system/konnectivity-agent-b4l88" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741325 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0031b709-1393-4369-bbf3-cc631f87aafc-konnectivity-ca\") pod \"konnectivity-agent-b4l88\" (UID: \"0031b709-1393-4369-bbf3-cc631f87aafc\") " pod="kube-system/konnectivity-agent-b4l88" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741330 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/d1d59cdd-035f-4424-9def-015beb3b369f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d1d59cdd-035f-4424-9def-015beb3b369f-cni-binary-copy\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.742182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741376 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-systemd-units\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741403 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/62da3121-b9c0-42d1-b441-45c1a4816f11-ovnkube-script-lib\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-run-openvswitch\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741429 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-kubernetes\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741448 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e62d12e3-203c-47e6-8292-aebabba9b716-hosts-file\") pod \"node-resolver-rllmv\" (UID: \"e62d12e3-203c-47e6-8292-aebabba9b716\") " pod="openshift-dns/node-resolver-rllmv" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjrld\" (UniqueName: \"kubernetes.io/projected/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-kube-api-access-wjrld\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741479 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0-host-slash\") pod \"iptables-alerter-r4xxs\" (UID: \"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0\") " pod="openshift-network-operator/iptables-alerter-r4xxs" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741490 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0-iptables-alerter-script\") pod \"iptables-alerter-r4xxs\" (UID: \"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0\") " pod="openshift-network-operator/iptables-alerter-r4xxs" Apr 22 21:09:13.743009 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:09:13.741504 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-os-release\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5e2e8788-770b-4c8b-aa3b-d52af912e57b-serviceca\") pod \"node-ca-q864p\" (UID: \"5e2e8788-770b-4c8b-aa3b-d52af912e57b\") " pod="openshift-image-registry/node-ca-q864p" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-kubernetes\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741632 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-node-log\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741663 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-socket-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.743009 
ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741672 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-systemd-units\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741686 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-device-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741711 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-var-lib-cni-bin\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741720 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0-host-slash\") pod \"iptables-alerter-r4xxs\" (UID: \"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0\") " pod="openshift-network-operator/iptables-alerter-r4xxs" Apr 22 21:09:13.743009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741734 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-sys\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.743796 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:09:13.741742 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-var-lib-kubelet\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741786 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-systemd\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741807 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-node-log\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741817 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcrjn\" (UniqueName: \"kubernetes.io/projected/a350647b-d99d-4b2d-b6df-a46ddc7da504-kube-api-access-kcrjn\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-sysconfig\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.743796 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:09:13.741867 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-lib-modules\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741916 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e62d12e3-203c-47e6-8292-aebabba9b716-tmp-dir\") pod \"node-resolver-rllmv\" (UID: \"e62d12e3-203c-47e6-8292-aebabba9b716\") " pod="openshift-dns/node-resolver-rllmv" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741942 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-run-ovn-kubernetes\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741952 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5e2e8788-770b-4c8b-aa3b-d52af912e57b-serviceca\") pod \"node-ca-q864p\" (UID: \"5e2e8788-770b-4c8b-aa3b-d52af912e57b\") " pod="openshift-image-registry/node-ca-q864p" Apr 22 21:09:13.743796 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:09:13.741969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741984 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-daemon-config\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741984 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-socket-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.741998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-cni-dir\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742016 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-os-release\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " 
pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742023 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-cnibin\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.743796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742033 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-device-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742054 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-etc-selinux\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742073 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-var-lib-cni-bin\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742080 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9420aad-9147-4086-9dbb-2f74a2f65676-cni-binary-copy\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " 
pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742105 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-socket-dir-parent\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742114 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-sys\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742129 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-conf-dir\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-etc-kubernetes\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp87c\" (UniqueName: \"kubernetes.io/projected/d9420aad-9147-4086-9dbb-2f74a2f65676-kube-api-access-vp87c\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744335 
ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742254 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-systemd\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742285 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-registration-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-system-cni-dir\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742361 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/62da3121-b9c0-42d1-b441-45c1a4816f11-env-overrides\") pod \"ovnkube-node-sshlp\" (UID: 
\"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742387 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-modprobe-d\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742415 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742480 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-etc-openvswitch\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.744335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742488 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d1d59cdd-035f-4424-9def-015beb3b369f-cni-binary-copy\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742504 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-slash\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742530 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-var-lib-openvswitch\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742550 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-conf-dir\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742560 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-var-lib-cni-multus\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742589 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlcz8\" (UniqueName: \"kubernetes.io/projected/0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0-kube-api-access-wlcz8\") pod \"iptables-alerter-r4xxs\" (UID: \"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0\") " pod="openshift-network-operator/iptables-alerter-r4xxs" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-socket-dir-parent\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rpwg\" (UniqueName: \"kubernetes.io/projected/5e2e8788-770b-4c8b-aa3b-d52af912e57b-kube-api-access-6rpwg\") pod \"node-ca-q864p\" (UID: \"5e2e8788-770b-4c8b-aa3b-d52af912e57b\") " pod="openshift-image-registry/node-ca-q864p" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-etc-kubernetes\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742651 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0031b709-1393-4369-bbf3-cc631f87aafc-konnectivity-ca\") pod \"konnectivity-agent-b4l88\" (UID: \"0031b709-1393-4369-bbf3-cc631f87aafc\") " pod="kube-system/konnectivity-agent-b4l88" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742702 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-run-ovn-kubernetes\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742868 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9420aad-9147-4086-9dbb-2f74a2f65676-cni-binary-copy\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742927 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e62d12e3-203c-47e6-8292-aebabba9b716-tmp-dir\") pod \"node-resolver-rllmv\" (UID: \"e62d12e3-203c-47e6-8292-aebabba9b716\") " pod="openshift-dns/node-resolver-rllmv" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742948 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742930 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-sysconfig\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743013 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-multus-cni-dir\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-lib-modules\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743060 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-cnibin\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js" Apr 22 21:09:13.744777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743076 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/62da3121-b9c0-42d1-b441-45c1a4816f11-ovnkube-script-lib\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743084 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.742415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-etc-selinux\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743111 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-registration-dir\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743166 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-system-cni-dir\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743212 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-etc-openvswitch\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743227 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-modprobe-d\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743251 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-host-slash\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743257 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d1d59cdd-035f-4424-9def-015beb3b369f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743292 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/62da3121-b9c0-42d1-b441-45c1a4816f11-var-lib-openvswitch\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743305 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9420aad-9147-4086-9dbb-2f74a2f65676-host-var-lib-cni-multus\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.743614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/62da3121-b9c0-42d1-b441-45c1a4816f11-env-overrides\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.744157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a350647b-d99d-4b2d-b6df-a46ddc7da504-tmp\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.744216 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/62da3121-b9c0-42d1-b441-45c1a4816f11-ovn-node-metrics-cert\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.744309 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0031b709-1393-4369-bbf3-cc631f87aafc-agent-certs\") pod \"konnectivity-agent-b4l88\" (UID: \"0031b709-1393-4369-bbf3-cc631f87aafc\") " pod="kube-system/konnectivity-agent-b4l88"
Apr 22 21:09:13.745264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.744815 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a350647b-d99d-4b2d-b6df-a46ddc7da504-etc-tuned\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk"
Apr 22 21:09:13.752060 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:13.752031 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:09:13.752613 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:13.752067 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:09:13.752613 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:13.752082 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xrbsg for pod openshift-network-diagnostics/network-check-target-b9rbt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:13.752613 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:13.752135 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg podName:d8c1c10e-bf24-4fb2-9019-e759c35b5460 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.252118428 +0000 UTC m=+2.125010272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xrbsg" (UniqueName: "kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg") pod "network-check-target-b9rbt" (UID: "d8c1c10e-bf24-4fb2-9019-e759c35b5460") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:13.752613 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.752542 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpf66\" (UniqueName: \"kubernetes.io/projected/80bac7af-2767-4aee-b3fa-d0683f389b6a-kube-api-access-vpf66\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j"
Apr 22 21:09:13.753896 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.753870 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjrld\" (UniqueName: \"kubernetes.io/projected/3f9ade50-8cb6-4e21-a9e4-dad84e22e88c-kube-api-access-wjrld\") pod \"aws-ebs-csi-driver-node-87ttd\" (UID: \"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd"
Apr 22 21:09:13.754316 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.754293 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rpwg\" (UniqueName: \"kubernetes.io/projected/5e2e8788-770b-4c8b-aa3b-d52af912e57b-kube-api-access-6rpwg\") pod \"node-ca-q864p\" (UID: \"5e2e8788-770b-4c8b-aa3b-d52af912e57b\") " pod="openshift-image-registry/node-ca-q864p"
Apr 22 21:09:13.754692 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.754670 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlcz8\" (UniqueName: \"kubernetes.io/projected/0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0-kube-api-access-wlcz8\") pod \"iptables-alerter-r4xxs\" (UID: \"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0\") " pod="openshift-network-operator/iptables-alerter-r4xxs"
Apr 22 21:09:13.754957 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.754934 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcrjn\" (UniqueName: \"kubernetes.io/projected/a350647b-d99d-4b2d-b6df-a46ddc7da504-kube-api-access-kcrjn\") pod \"tuned-ccslk\" (UID: \"a350647b-d99d-4b2d-b6df-a46ddc7da504\") " pod="openshift-cluster-node-tuning-operator/tuned-ccslk"
Apr 22 21:09:13.755182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.755155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp87c\" (UniqueName: \"kubernetes.io/projected/d9420aad-9147-4086-9dbb-2f74a2f65676-kube-api-access-vp87c\") pod \"multus-xv5js\" (UID: \"d9420aad-9147-4086-9dbb-2f74a2f65676\") " pod="openshift-multus/multus-xv5js"
Apr 22 21:09:13.755330 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.755315 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdq4\" (UniqueName: \"kubernetes.io/projected/62da3121-b9c0-42d1-b441-45c1a4816f11-kube-api-access-qjdq4\") pod \"ovnkube-node-sshlp\" (UID: \"62da3121-b9c0-42d1-b441-45c1a4816f11\") " pod="openshift-ovn-kubernetes/ovnkube-node-sshlp"
Apr 22 21:09:13.755491 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.755476 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq4s2\" (UniqueName: \"kubernetes.io/projected/e62d12e3-203c-47e6-8292-aebabba9b716-kube-api-access-vq4s2\") pod \"node-resolver-rllmv\" (UID: \"e62d12e3-203c-47e6-8292-aebabba9b716\") " pod="openshift-dns/node-resolver-rllmv"
Apr 22 21:09:13.756293 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.756276 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfjw6\" (UniqueName: \"kubernetes.io/projected/d1d59cdd-035f-4424-9def-015beb3b369f-kube-api-access-dfjw6\") pod \"multus-additional-cni-plugins-6z85g\" (UID: \"d1d59cdd-035f-4424-9def-015beb3b369f\") " pod="openshift-multus/multus-additional-cni-plugins-6z85g"
Apr 22 21:09:13.767597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.767012 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r4xxs"
Apr 22 21:09:13.774378 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.774361 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp"
Apr 22 21:09:13.782422 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:13.782393 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode52f89589d514c455852c1cdd49a71bd.slice/crio-3bce23b07b8b2b539db5b48391243aa10ebbb03ee375ae6bcba9e1c33ed28960 WatchSource:0}: Error finding container 3bce23b07b8b2b539db5b48391243aa10ebbb03ee375ae6bcba9e1c33ed28960: Status 404 returned error can't find the container with id 3bce23b07b8b2b539db5b48391243aa10ebbb03ee375ae6bcba9e1c33ed28960
Apr 22 21:09:13.783115 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:13.783097 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5401552a10b9bd31fa1f4a18dcace9bb.slice/crio-aa4cf3a240072ac48a96467e07f4648f2b163bef4dac06b7d7305c350b9cb395 WatchSource:0}: Error finding container aa4cf3a240072ac48a96467e07f4648f2b163bef4dac06b7d7305c350b9cb395: Status 404 returned error can't find the container with id aa4cf3a240072ac48a96467e07f4648f2b163bef4dac06b7d7305c350b9cb395
Apr 22 21:09:13.783869 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:13.783855 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0840e683_cbd0_4d5c_aa8f_e3ef7dc8bba0.slice/crio-73d96a18ea298020273ca09ba761e616e4323b3c6188677f6a5fe3efdd59493e WatchSource:0}: Error finding container 73d96a18ea298020273ca09ba761e616e4323b3c6188677f6a5fe3efdd59493e: Status 404 returned error can't find the container with id 73d96a18ea298020273ca09ba761e616e4323b3c6188677f6a5fe3efdd59493e
Apr 22 21:09:13.784317 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:13.784300 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62da3121_b9c0_42d1_b441_45c1a4816f11.slice/crio-9af8048f85c64dc9129442dbbc1f29036152bcb80aba6fdd4e75bd52751ddc96 WatchSource:0}: Error finding container 9af8048f85c64dc9129442dbbc1f29036152bcb80aba6fdd4e75bd52751ddc96: Status 404 returned error can't find the container with id 9af8048f85c64dc9129442dbbc1f29036152bcb80aba6fdd4e75bd52751ddc96
Apr 22 21:09:13.786656 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.786625 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 21:09:13.786725 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.786684 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"9af8048f85c64dc9129442dbbc1f29036152bcb80aba6fdd4e75bd52751ddc96"}
Apr 22 21:09:13.787879 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.787860 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r4xxs" event={"ID":"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0","Type":"ContainerStarted","Data":"73d96a18ea298020273ca09ba761e616e4323b3c6188677f6a5fe3efdd59493e"}
Apr 22 21:09:13.789012 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.788982 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" event={"ID":"e52f89589d514c455852c1cdd49a71bd","Type":"ContainerStarted","Data":"3bce23b07b8b2b539db5b48391243aa10ebbb03ee375ae6bcba9e1c33ed28960"}
Apr 22 21:09:13.789975 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.789946 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" event={"ID":"5401552a10b9bd31fa1f4a18dcace9bb","Type":"ContainerStarted","Data":"aa4cf3a240072ac48a96467e07f4648f2b163bef4dac06b7d7305c350b9cb395"}
Apr 22 21:09:13.950064 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.950045 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xv5js"
Apr 22 21:09:13.955465 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:13.955441 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9420aad_9147_4086_9dbb_2f74a2f65676.slice/crio-19d2b8b652122e8aa254109b45c957d3537846363713e97fdc854d558d1d7f8f WatchSource:0}: Error finding container 19d2b8b652122e8aa254109b45c957d3537846363713e97fdc854d558d1d7f8f: Status 404 returned error can't find the container with id 19d2b8b652122e8aa254109b45c957d3537846363713e97fdc854d558d1d7f8f
Apr 22 21:09:13.960179 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.960164 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6z85g"
Apr 22 21:09:13.966097 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:13.966077 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1d59cdd_035f_4424_9def_015beb3b369f.slice/crio-f6d590fe15daa3b1024afc222ecf36282784da65119a0292a24e7c086a2937b7 WatchSource:0}: Error finding container f6d590fe15daa3b1024afc222ecf36282784da65119a0292a24e7c086a2937b7: Status 404 returned error can't find the container with id f6d590fe15daa3b1024afc222ecf36282784da65119a0292a24e7c086a2937b7
Apr 22 21:09:13.967488 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.967472 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ccslk"
Apr 22 21:09:13.972734 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:13.972716 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda350647b_d99d_4b2d_b6df_a46ddc7da504.slice/crio-1989dfa3b611345e45e30f63b31d7d076c032db1431d3b987de93862e8ce2956 WatchSource:0}: Error finding container 1989dfa3b611345e45e30f63b31d7d076c032db1431d3b987de93862e8ce2956: Status 404 returned error can't find the container with id 1989dfa3b611345e45e30f63b31d7d076c032db1431d3b987de93862e8ce2956
Apr 22 21:09:13.992602 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:13.992578 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rllmv"
Apr 22 21:09:13.997913 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:13.997895 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode62d12e3_203c_47e6_8292_aebabba9b716.slice/crio-5bfc5f5de6e43171475c9cac6bd03c6cd269795fe0b3e54cf599b54a9e44edb1 WatchSource:0}: Error finding container 5bfc5f5de6e43171475c9cac6bd03c6cd269795fe0b3e54cf599b54a9e44edb1: Status 404 returned error can't find the container with id 5bfc5f5de6e43171475c9cac6bd03c6cd269795fe0b3e54cf599b54a9e44edb1
Apr 22 21:09:14.007927 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.007907 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-b4l88"
Apr 22 21:09:14.012751 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:14.012730 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0031b709_1393_4369_bbf3_cc631f87aafc.slice/crio-85b6a0b431d13a0607ac3740787bd9d42c076829ac5c71dc1a0058e2ce39a313 WatchSource:0}: Error finding container 85b6a0b431d13a0607ac3740787bd9d42c076829ac5c71dc1a0058e2ce39a313: Status 404 returned error can't find the container with id 85b6a0b431d13a0607ac3740787bd9d42c076829ac5c71dc1a0058e2ce39a313
Apr 22 21:09:14.020728 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.020712 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd"
Apr 22 21:09:14.025658 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:14.025639 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f9ade50_8cb6_4e21_a9e4_dad84e22e88c.slice/crio-24181ee95a2df467208d0f898489385805b2369df8cf0f12df9d9067a51519c5 WatchSource:0}: Error finding container 24181ee95a2df467208d0f898489385805b2369df8cf0f12df9d9067a51519c5: Status 404 returned error can't find the container with id 24181ee95a2df467208d0f898489385805b2369df8cf0f12df9d9067a51519c5
Apr 22 21:09:14.034820 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.034779 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-q864p"
Apr 22 21:09:14.040155 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:14.040116 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e2e8788_770b_4c8b_aa3b_d52af912e57b.slice/crio-c6ed2d76734115ed0c32fb01dcccaa797add2a835de8c1bf18fe2e661135edac WatchSource:0}: Error finding container c6ed2d76734115ed0c32fb01dcccaa797add2a835de8c1bf18fe2e661135edac: Status 404 returned error can't find the container with id c6ed2d76734115ed0c32fb01dcccaa797add2a835de8c1bf18fe2e661135edac
Apr 22 21:09:14.245658 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.245623 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j"
Apr 22 21:09:14.245825 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:14.245780 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:14.245889 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:14.245840 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs podName:80bac7af-2767-4aee-b3fa-d0683f389b6a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:15.245820407 +0000 UTC m=+3.118712240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs") pod "network-metrics-daemon-d7j8j" (UID: "80bac7af-2767-4aee-b3fa-d0683f389b6a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:14.346792 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.346707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt"
Apr 22 21:09:14.346943 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:14.346886 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:09:14.346943 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:14.346905 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:09:14.346943 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:14.346917 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xrbsg for pod openshift-network-diagnostics/network-check-target-b9rbt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:14.347096 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:14.346972 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg podName:d8c1c10e-bf24-4fb2-9019-e759c35b5460 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:15.346955065 +0000 UTC m=+3.219846921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xrbsg" (UniqueName: "kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg") pod "network-check-target-b9rbt" (UID: "d8c1c10e-bf24-4fb2-9019-e759c35b5460") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:14.639319 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.639063 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:14.691158 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.691060 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:04:13 +0000 UTC" deadline="2027-12-11 10:58:39.172148486 +0000 UTC"
Apr 22 21:09:14.691158 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.691090 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14341h49m24.481061664s"
Apr 22 21:09:14.819270 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.819218 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5js" event={"ID":"d9420aad-9147-4086-9dbb-2f74a2f65676","Type":"ContainerStarted","Data":"19d2b8b652122e8aa254109b45c957d3537846363713e97fdc854d558d1d7f8f"}
Apr 22 21:09:14.842354 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.842298 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q864p" event={"ID":"5e2e8788-770b-4c8b-aa3b-d52af912e57b","Type":"ContainerStarted","Data":"c6ed2d76734115ed0c32fb01dcccaa797add2a835de8c1bf18fe2e661135edac"}
Apr 22 21:09:14.860953 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.860927 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:14.867707 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.867675 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" event={"ID":"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c","Type":"ContainerStarted","Data":"24181ee95a2df467208d0f898489385805b2369df8cf0f12df9d9067a51519c5"}
Apr 22 21:09:14.891919 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.891847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ccslk" event={"ID":"a350647b-d99d-4b2d-b6df-a46ddc7da504","Type":"ContainerStarted","Data":"1989dfa3b611345e45e30f63b31d7d076c032db1431d3b987de93862e8ce2956"}
Apr 22 21:09:14.904755 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.904723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-b4l88" event={"ID":"0031b709-1393-4369-bbf3-cc631f87aafc","Type":"ContainerStarted","Data":"85b6a0b431d13a0607ac3740787bd9d42c076829ac5c71dc1a0058e2ce39a313"}
Apr 22 21:09:14.913519 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.913484 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rllmv" event={"ID":"e62d12e3-203c-47e6-8292-aebabba9b716","Type":"ContainerStarted","Data":"5bfc5f5de6e43171475c9cac6bd03c6cd269795fe0b3e54cf599b54a9e44edb1"}
Apr 22 21:09:14.933051 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.932959 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z85g" event={"ID":"d1d59cdd-035f-4424-9def-015beb3b369f","Type":"ContainerStarted","Data":"f6d590fe15daa3b1024afc222ecf36282784da65119a0292a24e7c086a2937b7"}
Apr 22 21:09:14.933607 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:14.933400 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:15.253589 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:15.253508 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j"
Apr 22 21:09:15.253750 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:15.253639 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:15.253750 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:15.253697 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs podName:80bac7af-2767-4aee-b3fa-d0683f389b6a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:17.253679585 +0000 UTC m=+5.126571415 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs") pod "network-metrics-daemon-d7j8j" (UID: "80bac7af-2767-4aee-b3fa-d0683f389b6a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:15.354018 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:15.353984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt"
Apr 22 21:09:15.354209 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:15.354197 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:09:15.354275 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:15.354216 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:09:15.354275 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:15.354230 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xrbsg for pod openshift-network-diagnostics/network-check-target-b9rbt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:15.354380 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:15.354286 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg podName:d8c1c10e-bf24-4fb2-9019-e759c35b5460 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:17.354268126 +0000 UTC m=+5.227159958 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xrbsg" (UniqueName: "kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg") pod "network-check-target-b9rbt" (UID: "d8c1c10e-bf24-4fb2-9019-e759c35b5460") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:15.692036 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:15.691934 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:04:13 +0000 UTC" deadline="2027-11-05 07:22:12.144501883 +0000 UTC"
Apr 22 21:09:15.692036 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:15.691971 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13474h12m56.452534658s"
Apr 22 21:09:15.784692 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:15.784656 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt"
Apr 22 21:09:15.784859 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:15.784778 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460"
Apr 22 21:09:15.785220 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:15.785200 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j"
Apr 22 21:09:15.785334 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:15.785308 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a"
Apr 22 21:09:17.272613 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:17.271983 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j"
Apr 22 21:09:17.272613 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:17.272183 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:17.272613 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:17.272254 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs podName:80bac7af-2767-4aee-b3fa-d0683f389b6a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:21.272234895 +0000 UTC m=+9.145126725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs") pod "network-metrics-daemon-d7j8j" (UID: "80bac7af-2767-4aee-b3fa-d0683f389b6a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:17.372805 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:17.372769 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt"
Apr 22 21:09:17.373015 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:17.372995 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:09:17.373088 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:17.373021 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:09:17.373088 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:17.373034 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xrbsg for pod openshift-network-diagnostics/network-check-target-b9rbt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:17.373210 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:17.373096 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg podName:d8c1c10e-bf24-4fb2-9019-e759c35b5460 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:21.373076771 +0000 UTC m=+9.245968605 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xrbsg" (UniqueName: "kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg") pod "network-check-target-b9rbt" (UID: "d8c1c10e-bf24-4fb2-9019-e759c35b5460") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:17.785272 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:17.785243 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j"
Apr 22 21:09:17.785425 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:17.785286 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt"
Apr 22 21:09:17.785425 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:17.785383 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a"
Apr 22 21:09:17.785549 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:17.785472 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460"
Apr 22 21:09:19.785530 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:19.785494 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt"
Apr 22 21:09:19.785967 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:19.785621 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460"
Apr 22 21:09:19.786066 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:19.786041 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j"
Apr 22 21:09:19.786213 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:19.786194 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a"
Apr 22 21:09:21.309967 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:21.309931 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j"
Apr 22 21:09:21.310444 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:21.310088 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:21.310444 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:21.310166 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs podName:80bac7af-2767-4aee-b3fa-d0683f389b6a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:29.310128626 +0000 UTC m=+17.183020458 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs") pod "network-metrics-daemon-d7j8j" (UID: "80bac7af-2767-4aee-b3fa-d0683f389b6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:21.411096 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:21.411061 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:21.411302 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:21.411283 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:21.411367 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:21.411305 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:21.411367 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:21.411317 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xrbsg for pod openshift-network-diagnostics/network-check-target-b9rbt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:21.411473 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:21.411376 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg podName:d8c1c10e-bf24-4fb2-9019-e759c35b5460 nodeName:}" failed. 
No retries permitted until 2026-04-22 21:09:29.411357554 +0000 UTC m=+17.284249387 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xrbsg" (UniqueName: "kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg") pod "network-check-target-b9rbt" (UID: "d8c1c10e-bf24-4fb2-9019-e759c35b5460") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:21.785230 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:21.785155 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:21.785396 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:21.785157 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:21.785396 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:21.785297 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:21.785396 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:21.785314 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:23.784744 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:23.784664 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:23.784744 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:23.784687 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:23.785220 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:23.784793 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:23.785220 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:23.784949 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:25.784704 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:25.784672 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:25.785126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:25.784672 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:25.785126 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:25.784783 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:25.785126 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:25.784852 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:27.784703 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:27.784672 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:27.785161 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:27.784672 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:27.785161 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:27.784791 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:27.785161 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:27.784912 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:29.370084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:29.370041 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:29.370481 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:29.370215 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:29.370481 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:29.370286 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs podName:80bac7af-2767-4aee-b3fa-d0683f389b6a nodeName:}" failed. No retries permitted until 2026-04-22 21:09:45.370263658 +0000 UTC m=+33.243155487 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs") pod "network-metrics-daemon-d7j8j" (UID: "80bac7af-2767-4aee-b3fa-d0683f389b6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:29.471000 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:29.470967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:29.471179 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:29.471130 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:29.471179 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:29.471166 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:29.471179 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:29.471177 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xrbsg for pod openshift-network-diagnostics/network-check-target-b9rbt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:29.471320 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:29.471229 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg podName:d8c1c10e-bf24-4fb2-9019-e759c35b5460 nodeName:}" failed. 
No retries permitted until 2026-04-22 21:09:45.471210737 +0000 UTC m=+33.344102569 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xrbsg" (UniqueName: "kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg") pod "network-check-target-b9rbt" (UID: "d8c1c10e-bf24-4fb2-9019-e759c35b5460") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:29.784937 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:29.784855 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:29.784937 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:29.784891 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:29.785174 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:29.784982 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:29.785174 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:29.785127 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:31.784701 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:31.784680 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:31.784994 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:31.784690 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:31.784994 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:31.784766 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:31.784994 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:31.784861 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:31.975521 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:31.975272 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"653173a1b9339e9eb860f186f05e40f443d7cbf880625807ad8ef3415989561f"} Apr 22 21:09:31.975521 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:31.975510 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"4586157c1ff602b957a4a25367ae8f2eb769c0776407435ace880ecde24a0c37"} Apr 22 21:09:31.977307 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:31.977270 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" event={"ID":"5401552a10b9bd31fa1f4a18dcace9bb","Type":"ContainerStarted","Data":"ec27627bc0074a3dfee231416033ffeb1a43f5d461761bf85231d4580b7ccae2"} Apr 22 21:09:31.978890 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:31.978866 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xv5js" event={"ID":"d9420aad-9147-4086-9dbb-2f74a2f65676","Type":"ContainerStarted","Data":"e183cd75c3401b28ba34726970a7d06bf42806debe10ed11c7f0c99454e3ac52"} Apr 22 21:09:31.980065 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:31.980033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ccslk" event={"ID":"a350647b-d99d-4b2d-b6df-a46ddc7da504","Type":"ContainerStarted","Data":"4dbadb44047a046fcfc1c08c75b41c2a03ddf3af5fc7d5dfa60cba552832ecd6"} Apr 22 21:09:31.991715 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:31.991670 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-133-75.ec2.internal" podStartSLOduration=19.991657913 podStartE2EDuration="19.991657913s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:31.991162621 +0000 UTC m=+19.864054472" watchObservedRunningTime="2026-04-22 21:09:31.991657913 +0000 UTC m=+19.864549765" Apr 22 21:09:32.010118 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.010084 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xv5js" podStartSLOduration=2.253284342 podStartE2EDuration="20.010074506s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.956779974 +0000 UTC m=+1.829671803" lastFinishedPulling="2026-04-22 21:09:31.713570128 +0000 UTC m=+19.586461967" observedRunningTime="2026-04-22 21:09:32.009396912 +0000 UTC m=+19.882288779" watchObservedRunningTime="2026-04-22 21:09:32.010074506 +0000 UTC m=+19.882966359" Apr 22 21:09:32.983726 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.983516 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:09:32.984463 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.983918 2569 generic.go:358] "Generic (PLEG): container finished" podID="62da3121-b9c0-42d1-b441-45c1a4816f11" containerID="653173a1b9339e9eb860f186f05e40f443d7cbf880625807ad8ef3415989561f" exitCode=1 Apr 22 21:09:32.984463 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.983971 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerDied","Data":"653173a1b9339e9eb860f186f05e40f443d7cbf880625807ad8ef3415989561f"} Apr 22 21:09:32.984463 ip-10-0-133-75 kubenswrapper[2569]: 
I0422 21:09:32.983991 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"f2045d7ddca156af6bfa01c0e1c70676d26d4c78a7387ff9f525988f337d8e1d"} Apr 22 21:09:32.984463 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.984000 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"4de38b659a58a2c6527198b0d2adb6506d2b452e9c04b4cb3f0742da48c9c4c7"} Apr 22 21:09:32.984463 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.984009 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"24d2b43af5068d9ad259b063c50f23dbbc7545ddddac6cc4d10820cc94056906"} Apr 22 21:09:32.984463 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.984017 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"e389d11287785fc288e8b21ffdca20265235b0d1abba4d3eac0ae05259ed8a5d"} Apr 22 21:09:32.985163 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.985128 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-b4l88" event={"ID":"0031b709-1393-4369-bbf3-cc631f87aafc","Type":"ContainerStarted","Data":"0798e98e5f0ff4c53d58a261180756b1ca53eac5213ca5f18ed1ecf19de417ed"} Apr 22 21:09:32.986369 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.986355 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rllmv" event={"ID":"e62d12e3-203c-47e6-8292-aebabba9b716","Type":"ContainerStarted","Data":"070cafdf3cfa551c3d66f2040f5432dd76405d3560cd200112fafcff19825a0c"} Apr 22 21:09:32.987618 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:09:32.987596 2569 generic.go:358] "Generic (PLEG): container finished" podID="d1d59cdd-035f-4424-9def-015beb3b369f" containerID="acc4fc182b5724eb499476045aa800e9f0d1949cfec00409327a50da653aedad" exitCode=0 Apr 22 21:09:32.987697 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.987671 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z85g" event={"ID":"d1d59cdd-035f-4424-9def-015beb3b369f","Type":"ContainerDied","Data":"acc4fc182b5724eb499476045aa800e9f0d1949cfec00409327a50da653aedad"} Apr 22 21:09:32.988889 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.988864 2569 generic.go:358] "Generic (PLEG): container finished" podID="e52f89589d514c455852c1cdd49a71bd" containerID="eeab1bfdcfeb2bc1a45d029f4d40de1fe2b073b4f0186b0fa5c522781f268681" exitCode=0 Apr 22 21:09:32.988986 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.988919 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" event={"ID":"e52f89589d514c455852c1cdd49a71bd","Type":"ContainerDied","Data":"eeab1bfdcfeb2bc1a45d029f4d40de1fe2b073b4f0186b0fa5c522781f268681"} Apr 22 21:09:32.990286 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.990190 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-q864p" event={"ID":"5e2e8788-770b-4c8b-aa3b-d52af912e57b","Type":"ContainerStarted","Data":"4a39e56fbd1d8913cc450095da308e743fade753405cfb91642c0055d07df894"} Apr 22 21:09:32.991513 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.991493 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" event={"ID":"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c","Type":"ContainerStarted","Data":"ee88288ac8eb17e93bb4baf6f0bafc02d816e4ec7bd5074a5ac4423a048bb8cf"} Apr 22 21:09:32.997949 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:32.997912 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ccslk" podStartSLOduration=3.293149485 podStartE2EDuration="20.997901547s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.973752844 +0000 UTC m=+1.846644673" lastFinishedPulling="2026-04-22 21:09:31.678504898 +0000 UTC m=+19.551396735" observedRunningTime="2026-04-22 21:09:32.028755725 +0000 UTC m=+19.901647577" watchObservedRunningTime="2026-04-22 21:09:32.997901547 +0000 UTC m=+20.870793463" Apr 22 21:09:33.010004 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.009971 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-b4l88" podStartSLOduration=3.371163894 podStartE2EDuration="21.009963995s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="2026-04-22 21:09:14.01402297 +0000 UTC m=+1.886914799" lastFinishedPulling="2026-04-22 21:09:31.652823056 +0000 UTC m=+19.525714900" observedRunningTime="2026-04-22 21:09:32.997742633 +0000 UTC m=+20.870634486" watchObservedRunningTime="2026-04-22 21:09:33.009963995 +0000 UTC m=+20.882855847" Apr 22 21:09:33.010123 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.010103 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-q864p" podStartSLOduration=6.238383838 podStartE2EDuration="20.010098745s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:14.043222044 +0000 UTC m=+1.916113875" lastFinishedPulling="2026-04-22 21:09:27.814936948 +0000 UTC m=+15.687828782" observedRunningTime="2026-04-22 21:09:33.009522186 +0000 UTC m=+20.882414049" watchObservedRunningTime="2026-04-22 21:09:33.010098745 +0000 UTC m=+20.882990600" Apr 22 21:09:33.021915 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.021722 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-rllmv" podStartSLOduration=7.2058975180000004 podStartE2EDuration="21.021710043s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.999126203 +0000 UTC m=+1.872018033" lastFinishedPulling="2026-04-22 21:09:27.814938706 +0000 UTC m=+15.687830558" observedRunningTime="2026-04-22 21:09:33.021396499 +0000 UTC m=+20.894288352" watchObservedRunningTime="2026-04-22 21:09:33.021710043 +0000 UTC m=+20.894601896" Apr 22 21:09:33.527315 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.527154 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 21:09:33.701634 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.701442 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T21:09:33.527311943Z","UUID":"7255ec88-a193-4561-ab08-9f8980311327","Handler":null,"Name":"","Endpoint":""} Apr 22 21:09:33.702883 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.702859 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 21:09:33.702883 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.702888 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 21:09:33.784860 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.784840 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:33.784979 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.784839 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:33.784979 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:33.784960 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:33.785069 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:33.785028 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:33.954447 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.954374 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-b4l88" Apr 22 21:09:33.954962 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.954943 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-b4l88" Apr 22 21:09:33.994313 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.994262 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r4xxs" event={"ID":"0840e683-cbd0-4d5c-aa8f-e3ef7dc8bba0","Type":"ContainerStarted","Data":"7dd37e2894d800884a6fe8c247d039c7c4edd0044360a8bd1bd52ecfe71e3933"} Apr 22 21:09:33.995962 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.995937 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" event={"ID":"e52f89589d514c455852c1cdd49a71bd","Type":"ContainerStarted","Data":"5846766063b1e88f7efc4ac909eb7cabfbfd3844c4e05a64af60cf02a3160733"} Apr 22 21:09:33.997468 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:33.997444 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" event={"ID":"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c","Type":"ContainerStarted","Data":"13b6ea3a75cae7ce0d87680ab2d1750b601ef5586e957b60bb378fb24bef7ce8"} Apr 22 21:09:34.006831 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:34.006795 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-r4xxs" podStartSLOduration=3.1141542700000002 podStartE2EDuration="21.006785808s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.786850742 +0000 UTC m=+1.659742576" lastFinishedPulling="2026-04-22 21:09:31.679482277 +0000 UTC m=+19.552374114" observedRunningTime="2026-04-22 21:09:34.006317086 +0000 UTC m=+21.879208937" watchObservedRunningTime="2026-04-22 21:09:34.006785808 +0000 UTC m=+21.879677659" Apr 22 21:09:34.019124 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:34.019063 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-75.ec2.internal" podStartSLOduration=21.019052151 podStartE2EDuration="21.019052151s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:34.018700889 +0000 UTC m=+21.891592743" watchObservedRunningTime="2026-04-22 21:09:34.019052151 +0000 UTC m=+21.891944003" Apr 22 21:09:35.000682 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:35.000602 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" event={"ID":"3f9ade50-8cb6-4e21-a9e4-dad84e22e88c","Type":"ContainerStarted","Data":"45d9d7fa2018acf880dccde9eaec1b930a5fd91627d28f650a5a252adb1499f9"} Apr 22 21:09:35.003908 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:35.003882 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:09:35.004313 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:35.004256 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"9a1a47d88439225cfb68f8568860e05aee86c2f3d4981611560c0e3ccc30e739"} Apr 22 21:09:35.004435 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:35.004315 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 21:09:35.016651 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:35.016596 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-87ttd" podStartSLOduration=1.553150588 podStartE2EDuration="22.016581881s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:14.026962758 +0000 UTC m=+1.899854588" lastFinishedPulling="2026-04-22 21:09:34.49039404 +0000 UTC m=+22.363285881" observedRunningTime="2026-04-22 21:09:35.016091355 +0000 UTC m=+22.888983208" watchObservedRunningTime="2026-04-22 21:09:35.016581881 +0000 UTC m=+22.889473734" Apr 22 21:09:35.785361 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:35.785330 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:35.785504 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:35.785330 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:35.785504 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:35.785459 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:35.785504 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:35.785494 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:37.784874 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:37.784739 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:37.785221 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:37.784739 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:37.785221 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:37.784987 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:37.785221 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:37.785030 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:38.013054 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:38.013031 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:09:38.013381 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:38.013353 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"343d5e8fad06f1162e83de1df041d0155f4c8e581949657213dd10529e967ea1"} Apr 22 21:09:38.013679 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:38.013660 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:38.013846 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:38.013833 2569 scope.go:117] "RemoveContainer" containerID="653173a1b9339e9eb860f186f05e40f443d7cbf880625807ad8ef3415989561f" Apr 22 21:09:38.027476 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:38.027456 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:39.016525 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.016342 2569 generic.go:358] "Generic (PLEG): container finished" podID="d1d59cdd-035f-4424-9def-015beb3b369f" 
containerID="3cb5e99898814013709196afc74130946daf4c1c257e6143dba5e99beb9f6bb7" exitCode=0 Apr 22 21:09:39.017043 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.016430 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z85g" event={"ID":"d1d59cdd-035f-4424-9def-015beb3b369f","Type":"ContainerDied","Data":"3cb5e99898814013709196afc74130946daf4c1c257e6143dba5e99beb9f6bb7"} Apr 22 21:09:39.019749 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.019732 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:09:39.020011 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.019987 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" event={"ID":"62da3121-b9c0-42d1-b441-45c1a4816f11","Type":"ContainerStarted","Data":"e4007f126ccc4d8feb77fe74f71a8ab174bc376289002b7ee3b3bc006825f8ce"} Apr 22 21:09:39.020199 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.020116 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 21:09:39.020380 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.020360 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:39.034488 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.034471 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:39.055655 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.055631 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-b4l88" Apr 22 21:09:39.055759 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.055748 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 21:09:39.056184 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:09:39.056165 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-b4l88" Apr 22 21:09:39.065324 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.065291 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" podStartSLOduration=8.008308041 podStartE2EDuration="26.065280872s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.78692614 +0000 UTC m=+1.659817973" lastFinishedPulling="2026-04-22 21:09:31.843898971 +0000 UTC m=+19.716790804" observedRunningTime="2026-04-22 21:09:39.063865957 +0000 UTC m=+26.936757808" watchObservedRunningTime="2026-04-22 21:09:39.065280872 +0000 UTC m=+26.938172733" Apr 22 21:09:39.784844 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.784822 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:39.784971 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:39.784822 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:39.784971 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:39.784931 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:39.785082 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:39.784972 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:40.023375 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:40.023344 2569 generic.go:358] "Generic (PLEG): container finished" podID="d1d59cdd-035f-4424-9def-015beb3b369f" containerID="6ad55e823e5d90b00ae21b86ee15afa061910da777bbbc67437d2d53938cd4ea" exitCode=0 Apr 22 21:09:40.023792 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:40.023412 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z85g" event={"ID":"d1d59cdd-035f-4424-9def-015beb3b369f","Type":"ContainerDied","Data":"6ad55e823e5d90b00ae21b86ee15afa061910da777bbbc67437d2d53938cd4ea"} Apr 22 21:09:40.023842 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:40.023799 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 21:09:41.026731 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:41.026699 2569 generic.go:358] "Generic (PLEG): container finished" podID="d1d59cdd-035f-4424-9def-015beb3b369f" containerID="a603b68b817c1a9c6a52922dab6dce7fd7a9d0740015c46b419b528a88e03d4a" exitCode=0 Apr 22 21:09:41.027130 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:41.026799 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z85g" event={"ID":"d1d59cdd-035f-4424-9def-015beb3b369f","Type":"ContainerDied","Data":"a603b68b817c1a9c6a52922dab6dce7fd7a9d0740015c46b419b528a88e03d4a"} Apr 22 
21:09:41.027130 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:41.026926 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 21:09:41.562279 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:41.562249 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:09:41.785305 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:41.785275 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:41.785439 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:41.785275 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:41.785439 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:41.785378 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:41.785548 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:41.785485 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:42.039613 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:42.039562 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" podUID="62da3121-b9c0-42d1-b441-45c1a4816f11" containerName="ovnkube-controller" probeResult="failure" output="" Apr 22 21:09:43.785069 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:43.785032 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:43.785500 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:43.785032 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:43.785500 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:43.785196 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:43.785500 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:43.785274 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:45.388460 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:45.388424 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:45.389171 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:45.388601 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:45.389171 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:45.388682 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs podName:80bac7af-2767-4aee-b3fa-d0683f389b6a nodeName:}" failed. No retries permitted until 2026-04-22 21:10:17.388659399 +0000 UTC m=+65.261551243 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs") pod "network-metrics-daemon-d7j8j" (UID: "80bac7af-2767-4aee-b3fa-d0683f389b6a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:45.489002 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:45.488971 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:45.489191 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:45.489118 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:45.489191 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:45.489152 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:45.489191 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:45.489163 2569 projected.go:194] Error preparing data for projected volume kube-api-access-xrbsg for pod openshift-network-diagnostics/network-check-target-b9rbt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:45.489354 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:45.489213 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg podName:d8c1c10e-bf24-4fb2-9019-e759c35b5460 nodeName:}" failed. 
No retries permitted until 2026-04-22 21:10:17.489195699 +0000 UTC m=+65.362087529 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-xrbsg" (UniqueName: "kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg") pod "network-check-target-b9rbt" (UID: "d8c1c10e-bf24-4fb2-9019-e759c35b5460") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:45.785061 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:45.785027 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:45.785061 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:45.785042 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:45.785311 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:45.785167 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:45.785377 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:45.785302 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:47.040351 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:47.040315 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z85g" event={"ID":"d1d59cdd-035f-4424-9def-015beb3b369f","Type":"ContainerStarted","Data":"21cda822349a8f056339f64ede475d9cbbd71f28c379de138888e115dca95b12"} Apr 22 21:09:47.784942 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:47.784912 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:47.785112 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:47.784912 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:47.785112 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:47.785034 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:47.785112 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:47.785075 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:48.044752 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:48.044682 2569 generic.go:358] "Generic (PLEG): container finished" podID="d1d59cdd-035f-4424-9def-015beb3b369f" containerID="21cda822349a8f056339f64ede475d9cbbd71f28c379de138888e115dca95b12" exitCode=0 Apr 22 21:09:48.044752 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:48.044730 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z85g" event={"ID":"d1d59cdd-035f-4424-9def-015beb3b369f","Type":"ContainerDied","Data":"21cda822349a8f056339f64ede475d9cbbd71f28c379de138888e115dca95b12"} Apr 22 21:09:49.049212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:49.049179 2569 generic.go:358] "Generic (PLEG): container finished" podID="d1d59cdd-035f-4424-9def-015beb3b369f" containerID="fa94ce9a2088fd19d83820da4d48bc9f82daa79a5590177fa2a0f007e401b6cb" exitCode=0 Apr 22 21:09:49.049561 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:49.049219 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z85g" event={"ID":"d1d59cdd-035f-4424-9def-015beb3b369f","Type":"ContainerDied","Data":"fa94ce9a2088fd19d83820da4d48bc9f82daa79a5590177fa2a0f007e401b6cb"} Apr 22 21:09:49.784467 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:49.784437 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:49.784647 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:49.784440 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:49.784647 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:49.784553 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:49.784647 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:49.784593 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:50.054081 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:50.054006 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z85g" event={"ID":"d1d59cdd-035f-4424-9def-015beb3b369f","Type":"ContainerStarted","Data":"aaaa986bcc39034195f3197d05b196682fd43b48f6dbc5f2d40affda5d6b0057"} Apr 22 21:09:50.073974 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:50.073929 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6z85g" podStartSLOduration=5.206025793 podStartE2EDuration="38.073915015s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="2026-04-22 21:09:13.967389457 +0000 UTC m=+1.840281291" lastFinishedPulling="2026-04-22 21:09:46.835278679 +0000 UTC m=+34.708170513" observedRunningTime="2026-04-22 21:09:50.073399349 +0000 UTC m=+37.946291202" 
watchObservedRunningTime="2026-04-22 21:09:50.073915015 +0000 UTC m=+37.946806904" Apr 22 21:09:51.785305 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:51.785268 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:51.785694 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:51.785268 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:51.785694 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:51.785366 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:51.785694 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:51.785439 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:52.744686 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:52.743511 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b9rbt"] Apr 22 21:09:52.744686 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:52.743831 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:52.744686 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:52.743937 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:52.746197 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:52.746176 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d7j8j"] Apr 22 21:09:52.746307 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:52.746260 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:52.746409 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:52.746362 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:54.785158 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:54.785124 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:54.785606 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:54.785182 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:54.785606 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:54.785279 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:54.786345 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:54.786248 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:56.784758 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:56.784724 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:56.785238 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:56.784739 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:56.785238 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:56.784826 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b9rbt" podUID="d8c1c10e-bf24-4fb2-9019-e759c35b5460" Apr 22 21:09:56.785238 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:56.784941 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-d7j8j" podUID="80bac7af-2767-4aee-b3fa-d0683f389b6a" Apr 22 21:09:56.991052 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:56.991020 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-75.ec2.internal" event="NodeReady" Apr 22 21:09:56.991244 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:56.991159 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 21:09:57.030925 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.030895 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xdxhq"] Apr 22 21:09:57.067682 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.067617 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-j7nqz"] Apr 22 21:09:57.067810 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.067754 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.069952 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.069932 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 21:09:57.070395 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.070378 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 21:09:57.070612 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.070595 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7ll98\"" Apr 22 21:09:57.097102 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.097083 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xdxhq"] Apr 22 21:09:57.097102 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.097103 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j7nqz"] Apr 22 21:09:57.097232 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.097209 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.099951 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.099812 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 21:09:57.099951 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.099826 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7tkh8\"" Apr 22 21:09:57.099951 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.099806 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 21:09:57.099951 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.099880 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 21:09:57.100182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.099971 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 21:09:57.134927 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.134905 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z44pn"] Apr 22 21:09:57.148598 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.148579 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z44pn"] Apr 22 21:09:57.148686 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.148672 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z44pn" Apr 22 21:09:57.150907 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.150884 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 21:09:57.151004 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.150905 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 21:09:57.151004 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.150939 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-c86tt\"" Apr 22 21:09:57.151004 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.151000 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 21:09:57.180869 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.180845 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-config-volume\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.180970 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.180877 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-metrics-tls\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.180970 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.180942 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-tmp-dir\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.181062 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.180978 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hxgs\" (UniqueName: \"kubernetes.io/projected/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-kube-api-access-8hxgs\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.282126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282102 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e22ce551-77f1-4198-bdd2-edf964b0a064-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.282263 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282130 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4xgj\" (UniqueName: \"kubernetes.io/projected/e22ce551-77f1-4198-bdd2-edf964b0a064-kube-api-access-p4xgj\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.282263 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282170 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hxgs\" (UniqueName: \"kubernetes.io/projected/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-kube-api-access-8hxgs\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.282263 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:09:57.282189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrnw7\" (UniqueName: \"kubernetes.io/projected/661d84d5-8c52-4e6c-823f-add2e843f2a4-kube-api-access-jrnw7\") pod \"ingress-canary-z44pn\" (UID: \"661d84d5-8c52-4e6c-823f-add2e843f2a4\") " pod="openshift-ingress-canary/ingress-canary-z44pn" Apr 22 21:09:57.282408 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282295 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e22ce551-77f1-4198-bdd2-edf964b0a064-data-volume\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.282408 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282330 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-config-volume\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.282408 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282349 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-metrics-tls\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.282408 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282392 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e22ce551-77f1-4198-bdd2-edf964b0a064-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " 
pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.282567 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282434 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-tmp-dir\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.282567 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/661d84d5-8c52-4e6c-823f-add2e843f2a4-cert\") pod \"ingress-canary-z44pn\" (UID: \"661d84d5-8c52-4e6c-823f-add2e843f2a4\") " pod="openshift-ingress-canary/ingress-canary-z44pn" Apr 22 21:09:57.282567 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282482 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e22ce551-77f1-4198-bdd2-edf964b0a064-crio-socket\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.282758 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282742 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-tmp-dir\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.282835 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.282818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-config-volume\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " 
pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.286367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.286345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-metrics-tls\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.288867 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.288840 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hxgs\" (UniqueName: \"kubernetes.io/projected/9bc6f374-c3e0-48a6-8231-9d5d16eea2ac-kube-api-access-8hxgs\") pod \"dns-default-xdxhq\" (UID: \"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac\") " pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.376339 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.376261 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xdxhq" Apr 22 21:09:57.383187 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.383126 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e22ce551-77f1-4198-bdd2-edf964b0a064-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.383307 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.383201 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/661d84d5-8c52-4e6c-823f-add2e843f2a4-cert\") pod \"ingress-canary-z44pn\" (UID: \"661d84d5-8c52-4e6c-823f-add2e843f2a4\") " pod="openshift-ingress-canary/ingress-canary-z44pn" Apr 22 21:09:57.383307 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.383228 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e22ce551-77f1-4198-bdd2-edf964b0a064-crio-socket\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.383307 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.383259 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e22ce551-77f1-4198-bdd2-edf964b0a064-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.383465 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.383391 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4xgj\" (UniqueName: \"kubernetes.io/projected/e22ce551-77f1-4198-bdd2-edf964b0a064-kube-api-access-p4xgj\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.383465 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.383400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e22ce551-77f1-4198-bdd2-edf964b0a064-crio-socket\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.383465 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.383435 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrnw7\" (UniqueName: \"kubernetes.io/projected/661d84d5-8c52-4e6c-823f-add2e843f2a4-kube-api-access-jrnw7\") pod \"ingress-canary-z44pn\" (UID: \"661d84d5-8c52-4e6c-823f-add2e843f2a4\") " pod="openshift-ingress-canary/ingress-canary-z44pn" Apr 22 21:09:57.383602 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:09:57.383475 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e22ce551-77f1-4198-bdd2-edf964b0a064-data-volume\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.383970 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.383839 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e22ce551-77f1-4198-bdd2-edf964b0a064-data-volume\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.384071 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.383971 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e22ce551-77f1-4198-bdd2-edf964b0a064-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.386368 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.386347 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/661d84d5-8c52-4e6c-823f-add2e843f2a4-cert\") pod \"ingress-canary-z44pn\" (UID: \"661d84d5-8c52-4e6c-823f-add2e843f2a4\") " pod="openshift-ingress-canary/ingress-canary-z44pn" Apr 22 21:09:57.386471 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.386379 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e22ce551-77f1-4198-bdd2-edf964b0a064-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " 
pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.390500 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.390480 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4xgj\" (UniqueName: \"kubernetes.io/projected/e22ce551-77f1-4198-bdd2-edf964b0a064-kube-api-access-p4xgj\") pod \"insights-runtime-extractor-j7nqz\" (UID: \"e22ce551-77f1-4198-bdd2-edf964b0a064\") " pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.390608 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.390586 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrnw7\" (UniqueName: \"kubernetes.io/projected/661d84d5-8c52-4e6c-823f-add2e843f2a4-kube-api-access-jrnw7\") pod \"ingress-canary-z44pn\" (UID: \"661d84d5-8c52-4e6c-823f-add2e843f2a4\") " pod="openshift-ingress-canary/ingress-canary-z44pn" Apr 22 21:09:57.415446 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.415420 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j7nqz" Apr 22 21:09:57.456550 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.456522 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z44pn" Apr 22 21:09:57.550829 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.550802 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xdxhq"] Apr 22 21:09:57.553880 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.553844 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j7nqz"] Apr 22 21:09:57.554914 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:57.554883 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bc6f374_c3e0_48a6_8231_9d5d16eea2ac.slice/crio-b27d0dd8ee5dd42e512771ab8646eee7efcb5fdcdf60bfdb30e06258f087206d WatchSource:0}: Error finding container b27d0dd8ee5dd42e512771ab8646eee7efcb5fdcdf60bfdb30e06258f087206d: Status 404 returned error can't find the container with id b27d0dd8ee5dd42e512771ab8646eee7efcb5fdcdf60bfdb30e06258f087206d Apr 22 21:09:57.558370 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:57.558344 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode22ce551_77f1_4198_bdd2_edf964b0a064.slice/crio-8c035ee4177823f2ed013cc110f081ff37f90fe69e5abf38567962f37d3a9915 WatchSource:0}: Error finding container 8c035ee4177823f2ed013cc110f081ff37f90fe69e5abf38567962f37d3a9915: Status 404 returned error can't find the container with id 8c035ee4177823f2ed013cc110f081ff37f90fe69e5abf38567962f37d3a9915 Apr 22 21:09:57.581055 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:57.581033 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z44pn"] Apr 22 21:09:57.583855 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:09:57.583832 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod661d84d5_8c52_4e6c_823f_add2e843f2a4.slice/crio-0f7f91f85d48b5f663464d98fa49bf777f02dc84ade6d0a2401e55eaa0bf264a WatchSource:0}: Error finding container 0f7f91f85d48b5f663464d98fa49bf777f02dc84ade6d0a2401e55eaa0bf264a: Status 404 returned error can't find the container with id 0f7f91f85d48b5f663464d98fa49bf777f02dc84ade6d0a2401e55eaa0bf264a Apr 22 21:09:58.069382 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.069343 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z44pn" event={"ID":"661d84d5-8c52-4e6c-823f-add2e843f2a4","Type":"ContainerStarted","Data":"0f7f91f85d48b5f663464d98fa49bf777f02dc84ade6d0a2401e55eaa0bf264a"} Apr 22 21:09:58.071258 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.071228 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j7nqz" event={"ID":"e22ce551-77f1-4198-bdd2-edf964b0a064","Type":"ContainerStarted","Data":"9ae9925b138b9c4a730d741616cbd5126081b39b81a0cf7d6be56883d5572c07"} Apr 22 21:09:58.071394 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.071267 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j7nqz" event={"ID":"e22ce551-77f1-4198-bdd2-edf964b0a064","Type":"ContainerStarted","Data":"8c035ee4177823f2ed013cc110f081ff37f90fe69e5abf38567962f37d3a9915"} Apr 22 21:09:58.072331 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.072302 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xdxhq" event={"ID":"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac","Type":"ContainerStarted","Data":"b27d0dd8ee5dd42e512771ab8646eee7efcb5fdcdf60bfdb30e06258f087206d"} Apr 22 21:09:58.785156 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.785098 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:09:58.785400 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.785377 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:09:58.789600 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.789553 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bm97r\"" Apr 22 21:09:58.789600 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.789567 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 21:09:58.789783 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.789567 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ccfht\"" Apr 22 21:09:58.789783 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.789646 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 21:09:58.789783 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:58.789554 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 21:09:59.635408 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.635376 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"] Apr 22 21:09:59.667854 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.667301 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"] Apr 22 21:09:59.669290 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.668674 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" Apr 22 21:09:59.674161 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.674122 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 21:09:59.677741 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.674409 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-rb8p5\"" Apr 22 21:09:59.677826 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.674492 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 21:09:59.677891 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.674663 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 21:09:59.677891 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.674719 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 21:09:59.679076 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.679054 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 21:09:59.698639 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.698617 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk776\" (UniqueName: \"kubernetes.io/projected/0ae169a5-d6ee-433b-af7c-56170339c262-kube-api-access-zk776\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" Apr 22 21:09:59.698722 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.698652 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae169a5-d6ee-433b-af7c-56170339c262-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" Apr 22 21:09:59.698764 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.698728 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae169a5-d6ee-433b-af7c-56170339c262-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" Apr 22 21:09:59.698764 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.698757 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ae169a5-d6ee-433b-af7c-56170339c262-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" Apr 22 21:09:59.699252 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.699223 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"] Apr 22 21:09:59.699252 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.699246 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"] Apr 22 21:09:59.699410 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.699261 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jrgd2"] Apr 22 21:09:59.699410 ip-10-0-133-75 kubenswrapper[2569]: 
I0422 21:09:59.699332 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7" Apr 22 21:09:59.701654 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.701637 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 21:09:59.701839 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.701818 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-q6kbz\"" Apr 22 21:09:59.701934 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.701818 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 21:09:59.701934 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.701873 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 21:09:59.714088 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.714072 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jrgd2" Apr 22 21:09:59.716197 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.716178 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 21:09:59.716197 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.716189 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 21:09:59.716349 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.716186 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7bz2b\"" Apr 22 21:09:59.716349 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.716295 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 21:09:59.799731 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799700 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7" Apr 22 21:09:59.799890 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799736 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kht9\" (UniqueName: \"kubernetes.io/projected/832e8ef7-8db0-4d87-a721-afdfff094b49-kube-api-access-7kht9\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2" Apr 22 21:09:59.799890 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799765 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-textfile\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.799890 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799830 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ae169a5-d6ee-433b-af7c-56170339c262-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:09:59.799890 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.799890 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799888 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.800166 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799910 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/832e8ef7-8db0-4d87-a721-afdfff094b49-sys\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.800166 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799926 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-tls\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.800166 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799963 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcr7p\" (UniqueName: \"kubernetes.io/projected/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-api-access-rcr7p\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.800166 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.799989 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.800166 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800014 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae169a5-d6ee-433b-af7c-56170339c262-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:09:59.800166 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800062 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/832e8ef7-8db0-4d87-a721-afdfff094b49-root\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.800166 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800136 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/832e8ef7-8db0-4d87-a721-afdfff094b49-metrics-client-ca\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.800166 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:59.800139 2569 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 22 21:09:59.800516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-wtmp\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.800516 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:59.800220 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ae169a5-d6ee-433b-af7c-56170339c262-openshift-state-metrics-tls podName:0ae169a5-d6ee-433b-af7c-56170339c262 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:00.300199915 +0000 UTC m=+48.173091765 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/0ae169a5-d6ee-433b-af7c-56170339c262-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-7zzq2" (UID: "0ae169a5-d6ee-433b-af7c-56170339c262") : secret "openshift-state-metrics-tls" not found
Apr 22 21:09:59.800516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk776\" (UniqueName: \"kubernetes.io/projected/0ae169a5-d6ee-433b-af7c-56170339c262-kube-api-access-zk776\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:09:59.800516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800280 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.800516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800305 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-accelerators-collector-config\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.800516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800323 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.800516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae169a5-d6ee-433b-af7c-56170339c262-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:09:59.800950 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.800934 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ae169a5-d6ee-433b-af7c-56170339c262-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:09:59.802643 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.802618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ae169a5-d6ee-433b-af7c-56170339c262-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:09:59.807690 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.807663 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk776\" (UniqueName: \"kubernetes.io/projected/0ae169a5-d6ee-433b-af7c-56170339c262-kube-api-access-zk776\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:09:59.901165 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/832e8ef7-8db0-4d87-a721-afdfff094b49-root\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901165 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901120 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/832e8ef7-8db0-4d87-a721-afdfff094b49-metrics-client-ca\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901165 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901156 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-wtmp\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901355 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901185 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.901355 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901203 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/832e8ef7-8db0-4d87-a721-afdfff094b49-root\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901355 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901206 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-accelerators-collector-config\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901355 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901245 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.901355 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901281 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.901355 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901299 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kht9\" (UniqueName: \"kubernetes.io/projected/832e8ef7-8db0-4d87-a721-afdfff094b49-kube-api-access-7kht9\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901355 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901318 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-textfile\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901355 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901345 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-wtmp\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901355 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.901752 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901382 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901752 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901420 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/832e8ef7-8db0-4d87-a721-afdfff094b49-sys\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901752 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-tls\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901752 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901487 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcr7p\" (UniqueName: \"kubernetes.io/projected/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-api-access-rcr7p\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.901752 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901520 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.901752 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:59.901646 2569 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 22 21:09:59.901752 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:09:59.901702 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-tls podName:8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb nodeName:}" failed. No retries permitted until 2026-04-22 21:10:00.401685002 +0000 UTC m=+48.274576852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-8ktp7" (UID: "8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb") : secret "kube-state-metrics-tls" not found
Apr 22 21:09:59.901752 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901704 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-textfile\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.901752 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901715 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/832e8ef7-8db0-4d87-a721-afdfff094b49-sys\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.902210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901763 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/832e8ef7-8db0-4d87-a721-afdfff094b49-metrics-client-ca\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.902210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901815 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-accelerators-collector-config\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.902210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.901979 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.902210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.902179 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.902418 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.902389 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.903732 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.903713 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:09:59.903890 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.903873 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.904306 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.904291 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/832e8ef7-8db0-4d87-a721-afdfff094b49-node-exporter-tls\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.911239 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.911220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kht9\" (UniqueName: \"kubernetes.io/projected/832e8ef7-8db0-4d87-a721-afdfff094b49-kube-api-access-7kht9\") pod \"node-exporter-jrgd2\" (UID: \"832e8ef7-8db0-4d87-a721-afdfff094b49\") " pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:09:59.911629 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:09:59.911608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcr7p\" (UniqueName: \"kubernetes.io/projected/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-api-access-rcr7p\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:10:00.022486 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.022456 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jrgd2"
Apr 22 21:10:00.070470 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:00.070444 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod832e8ef7_8db0_4d87_a721_afdfff094b49.slice/crio-f60e9b3218cb8d06a9ed87fcf6149f2c4de4ac79d7d6e634f7a8ff42d6b4a037 WatchSource:0}: Error finding container f60e9b3218cb8d06a9ed87fcf6149f2c4de4ac79d7d6e634f7a8ff42d6b4a037: Status 404 returned error can't find the container with id f60e9b3218cb8d06a9ed87fcf6149f2c4de4ac79d7d6e634f7a8ff42d6b4a037
Apr 22 21:10:00.070791 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.070725 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-8fb9f8f44-5nlb2"]
Apr 22 21:10:00.104961 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.104937 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8fb9f8f44-5nlb2"]
Apr 22 21:10:00.105076 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.104962 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jrgd2" event={"ID":"832e8ef7-8db0-4d87-a721-afdfff094b49","Type":"ContainerStarted","Data":"f60e9b3218cb8d06a9ed87fcf6149f2c4de4ac79d7d6e634f7a8ff42d6b4a037"}
Apr 22 21:10:00.105076 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.105063 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.107616 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.107591 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 21:10:00.107718 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.107680 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 21:10:00.107803 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.107744 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 21:10:00.107865 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.107811 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 21:10:00.107865 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.107744 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pg8x7\""
Apr 22 21:10:00.107964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.107951 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 21:10:00.108090 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.108072 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 21:10:00.108178 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.108089 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 21:10:00.203285 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.203033 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-console-config\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.203394 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.203338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd95v\" (UniqueName: \"kubernetes.io/projected/53580084-30b5-4540-b077-e50d91769724-kube-api-access-dd95v\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.203394 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.203374 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-service-ca\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.203501 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.203412 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-oauth-config\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.203587 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.203565 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-oauth-serving-cert\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.203643 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.203622 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-serving-cert\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.306176 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.304937 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd95v\" (UniqueName: \"kubernetes.io/projected/53580084-30b5-4540-b077-e50d91769724-kube-api-access-dd95v\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.306176 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.305811 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-service-ca\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.306176 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.305944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-service-ca\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.306176 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.306015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-oauth-config\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.306176 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.306128 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-oauth-serving-cert\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.306554 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.306183 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae169a5-d6ee-433b-af7c-56170339c262-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:10:00.306554 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.306223 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-serving-cert\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.306554 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.306279 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-console-config\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.310069 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.307108 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-console-config\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.310069 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.307326 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-oauth-serving-cert\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.312160 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.312126 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-serving-cert\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.313976 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.313934 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-oauth-config\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.314207 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.314183 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ae169a5-d6ee-433b-af7c-56170339c262-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7zzq2\" (UID: \"0ae169a5-d6ee-433b-af7c-56170339c262\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:10:00.320705 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.320665 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd95v\" (UniqueName: \"kubernetes.io/projected/53580084-30b5-4540-b077-e50d91769724-kube-api-access-dd95v\") pod \"console-8fb9f8f44-5nlb2\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.407036 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.406962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:10:00.409827 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.409799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8ktp7\" (UID: \"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:10:00.414530 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.414514 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8fb9f8f44-5nlb2"
Apr 22 21:10:00.539451 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.539408 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8fb9f8f44-5nlb2"]
Apr 22 21:10:00.545198 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:00.545138 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53580084_30b5_4540_b077_e50d91769724.slice/crio-220661187a0d56e7140f5eb35885349be8a3e734671a1ff51f8a78ee3952888a WatchSource:0}: Error finding container 220661187a0d56e7140f5eb35885349be8a3e734671a1ff51f8a78ee3952888a: Status 404 returned error can't find the container with id 220661187a0d56e7140f5eb35885349be8a3e734671a1ff51f8a78ee3952888a
Apr 22 21:10:00.586367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.586332 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"
Apr 22 21:10:00.607513 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.607485 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"
Apr 22 21:10:00.702155 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.702106 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 21:10:00.712097 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.712069 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 21:10:00.714402 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.714376 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 21:10:00.716087 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.714856 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xb2qx\""
Apr 22 21:10:00.716087 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.715263 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 21:10:00.716087 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.715496 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 21:10:00.716087 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.715759 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 21:10:00.716087 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.715945 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 21:10:00.717227 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.717129 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 21:10:00.718105 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.718084 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 21:10:00.718520 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.718362 2569 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 21:10:00.721026 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.720005 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 21:10:00.724372 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.724347 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:10:00.724455 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.724386 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2"] Apr 22 21:10:00.744845 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.744798 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8ktp7"] Apr 22 21:10:00.810177 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810127 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810328 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810188 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810328 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810227 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-web-config\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810328 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810252 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810328 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810322 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810342 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmcrf\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-kube-api-access-xmcrf\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810371 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-config-volume\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810597 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:10:00.810417 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-config-out\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810471 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810491 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810550 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.810597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.810590 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.866482 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:00.866438 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae169a5_d6ee_433b_af7c_56170339c262.slice/crio-4a8d7542892f1d8c08eb11d7f40a1e5d2768987ff7d60e92e2ed7463a7d13824 WatchSource:0}: Error finding container 4a8d7542892f1d8c08eb11d7f40a1e5d2768987ff7d60e92e2ed7463a7d13824: Status 404 returned error can't find the container with id 4a8d7542892f1d8c08eb11d7f40a1e5d2768987ff7d60e92e2ed7463a7d13824 Apr 22 21:10:00.866740 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:00.866706 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d4bb5e4_1b2f_4fdb_bb90_9147216fe7eb.slice/crio-10cd52fd0ce68e7928796fefa539a0f138bf3a294adfd61947edc6000e4ae486 WatchSource:0}: Error finding container 10cd52fd0ce68e7928796fefa539a0f138bf3a294adfd61947edc6000e4ae486: Status 404 returned error can't find the container with id 10cd52fd0ce68e7928796fefa539a0f138bf3a294adfd61947edc6000e4ae486 Apr 22 21:10:00.911139 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911087 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911139 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911121 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911280 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911160 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-web-config\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911280 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911187 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911280 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911280 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911266 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xmcrf\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-kube-api-access-xmcrf\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911401 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911290 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-config-volume\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911401 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911334 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-config-out\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911401 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911367 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911401 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911529 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:10:00.911420 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911529 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.911529 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.911477 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.912388 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.912196 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.912490 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:00.912338 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-trusted-ca-bundle podName:bc3a7000-51c7-4478-b73c-963df00f3606 nodeName:}" failed. 
No retries permitted until 2026-04-22 21:10:01.412318924 +0000 UTC m=+49.285210769 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606") : configmap references non-existent config key: ca-bundle.crt Apr 22 21:10:00.914878 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.914849 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-config-out\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.914968 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.914949 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-web-config\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.914968 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.914960 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.915135 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.915114 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.915268 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.915247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.915712 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.915689 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.917250 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.917224 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.920415 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.920391 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmcrf\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-kube-api-access-xmcrf\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.921056 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.921031 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.921792 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.921769 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-config-volume\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:00.921866 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:00.921794 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:01.084081 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.084031 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7" event={"ID":"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb","Type":"ContainerStarted","Data":"10cd52fd0ce68e7928796fefa539a0f138bf3a294adfd61947edc6000e4ae486"} Apr 22 21:10:01.085810 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.085787 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j7nqz" event={"ID":"e22ce551-77f1-4198-bdd2-edf964b0a064","Type":"ContainerStarted","Data":"c990b7ff5fad6d3f02837f21f50b27dd80796437fa231850f45075dd1ac6d8c0"} Apr 22 21:10:01.087346 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.087307 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" event={"ID":"0ae169a5-d6ee-433b-af7c-56170339c262","Type":"ContainerStarted","Data":"7429e6538290e21d67a6ea5416cdc2b3bf3420a116323cd4529259c32b3fd9dd"} Apr 22 21:10:01.087346 
ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.087331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" event={"ID":"0ae169a5-d6ee-433b-af7c-56170339c262","Type":"ContainerStarted","Data":"4a8d7542892f1d8c08eb11d7f40a1e5d2768987ff7d60e92e2ed7463a7d13824"} Apr 22 21:10:01.089172 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.089131 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xdxhq" event={"ID":"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac","Type":"ContainerStarted","Data":"58ad6e8373a7b234eaf63dc7ccc78e8a4da7571a3f4e9090efcb3bfd63321baf"} Apr 22 21:10:01.089309 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.089293 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xdxhq" event={"ID":"9bc6f374-c3e0-48a6-8231-9d5d16eea2ac","Type":"ContainerStarted","Data":"a2177831fd5645698b4ccae10cc0ef778d2d0fc8a5c2bc932f979a29cb9e6717"} Apr 22 21:10:01.089435 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.089413 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xdxhq" Apr 22 21:10:01.090613 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.090593 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z44pn" event={"ID":"661d84d5-8c52-4e6c-823f-add2e843f2a4","Type":"ContainerStarted","Data":"976e6a09d1f712391ce8be6421aa70146fa19736182d2a13b703955b48d61546"} Apr 22 21:10:01.092498 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.092472 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8fb9f8f44-5nlb2" event={"ID":"53580084-30b5-4540-b077-e50d91769724","Type":"ContainerStarted","Data":"220661187a0d56e7140f5eb35885349be8a3e734671a1ff51f8a78ee3952888a"} Apr 22 21:10:01.104112 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.104075 2569 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-dns/dns-default-xdxhq" podStartSLOduration=1.594887524 podStartE2EDuration="4.104063964s" podCreationTimestamp="2026-04-22 21:09:57 +0000 UTC" firstStartedPulling="2026-04-22 21:09:57.557259419 +0000 UTC m=+45.430151249" lastFinishedPulling="2026-04-22 21:10:00.066435854 +0000 UTC m=+47.939327689" observedRunningTime="2026-04-22 21:10:01.103580241 +0000 UTC m=+48.976472094" watchObservedRunningTime="2026-04-22 21:10:01.104063964 +0000 UTC m=+48.976955842" Apr 22 21:10:01.118264 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.118222 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z44pn" podStartSLOduration=1.632563837 podStartE2EDuration="4.118204041s" podCreationTimestamp="2026-04-22 21:09:57 +0000 UTC" firstStartedPulling="2026-04-22 21:09:57.585818577 +0000 UTC m=+45.458710406" lastFinishedPulling="2026-04-22 21:10:00.071458765 +0000 UTC m=+47.944350610" observedRunningTime="2026-04-22 21:10:01.117376022 +0000 UTC m=+48.990267876" watchObservedRunningTime="2026-04-22 21:10:01.118204041 +0000 UTC m=+48.991095892" Apr 22 21:10:01.415815 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.415771 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:01.416784 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.416742 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
21:10:01.628613 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:01.628550 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:02.097222 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.096976 2569 generic.go:358] "Generic (PLEG): container finished" podID="832e8ef7-8db0-4d87-a721-afdfff094b49" containerID="3224ebf64795d8cddf89d826fcb19df0f1eb72876508e34d1ccf5d6a23fdaae1" exitCode=0 Apr 22 21:10:02.098085 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.097044 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jrgd2" event={"ID":"832e8ef7-8db0-4d87-a721-afdfff094b49","Type":"ContainerDied","Data":"3224ebf64795d8cddf89d826fcb19df0f1eb72876508e34d1ccf5d6a23fdaae1"} Apr 22 21:10:02.100291 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.100260 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" event={"ID":"0ae169a5-d6ee-433b-af7c-56170339c262","Type":"ContainerStarted","Data":"03a45adbca276fe85ba44c8f18b417227907db19a1c627d369dc70390557ae51"} Apr 22 21:10:02.117568 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.117541 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:10:02.213830 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:02.213792 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc3a7000_51c7_4478_b73c_963df00f3606.slice/crio-55254b91c8e68d1e4c3136d314639b76c3570a1b212baea420b9a811b0e250d8 WatchSource:0}: Error finding container 55254b91c8e68d1e4c3136d314639b76c3570a1b212baea420b9a811b0e250d8: Status 404 returned error can't find the container with id 55254b91c8e68d1e4c3136d314639b76c3570a1b212baea420b9a811b0e250d8 Apr 22 21:10:02.707345 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.707317 2569 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"]
Apr 22 21:10:02.722236 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.721537 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.726314 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.723994 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 21:10:02.726314 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.724268 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 21:10:02.726314 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.724434 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 21:10:02.726314 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.724642 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 21:10:02.726314 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.724643 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"]
Apr 22 21:10:02.728172 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.727955 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-zd6hh\""
Apr 22 21:10:02.728172 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.728103 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-9o99ljv8i3c87\""
Apr 22 21:10:02.729944 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.729921 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 21:10:02.828961 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.828928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.829129 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.828970 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.829129 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.828992 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.829129 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.829113 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-grpc-tls\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.829308 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.829159 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7bdg\" (UniqueName: \"kubernetes.io/projected/cd925cbe-dac6-4f84-8d08-c8337a350d1d-kube-api-access-d7bdg\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.829308 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.829224 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.829308 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.829254 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-tls\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.829308 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.829279 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd925cbe-dac6-4f84-8d08-c8337a350d1d-metrics-client-ca\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.930008 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.929974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-grpc-tls\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.930008 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.930016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7bdg\" (UniqueName: \"kubernetes.io/projected/cd925cbe-dac6-4f84-8d08-c8337a350d1d-kube-api-access-d7bdg\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.930275 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.930060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.930275 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.930095 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-tls\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.930275 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.930119 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd925cbe-dac6-4f84-8d08-c8337a350d1d-metrics-client-ca\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.930275 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.930183 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.930275 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.930211 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.930275 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.930242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.932219 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.932190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cd925cbe-dac6-4f84-8d08-c8337a350d1d-metrics-client-ca\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.935025 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.934973 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.935025 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.934999 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.935183 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.935097 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.935253 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.935219 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.935323 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.935281 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-grpc-tls\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.935611 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.935565 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cd925cbe-dac6-4f84-8d08-c8337a350d1d-secret-thanos-querier-tls\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:02.937170 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:02.937134 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7bdg\" (UniqueName: \"kubernetes.io/projected/cd925cbe-dac6-4f84-8d08-c8337a350d1d-kube-api-access-d7bdg\") pod \"thanos-querier-6bd6dd8cc8-pshnk\" (UID: \"cd925cbe-dac6-4f84-8d08-c8337a350d1d\") " pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:03.040253 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.040214 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"
Apr 22 21:10:03.106561 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.106509 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j7nqz" event={"ID":"e22ce551-77f1-4198-bdd2-edf964b0a064","Type":"ContainerStarted","Data":"d7f988dae669fa99f16be8045c4cd759f293288280787d4f4ec3fcdb9cdcfe6e"}
Apr 22 21:10:03.108597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.108543 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jrgd2" event={"ID":"832e8ef7-8db0-4d87-a721-afdfff094b49","Type":"ContainerStarted","Data":"dcacf94c3f46337a6f8d89d6d409aa6ab29e4c1ec8349e30b4d26d34eabd63a1"}
Apr 22 21:10:03.108597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.108577 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jrgd2" event={"ID":"832e8ef7-8db0-4d87-a721-afdfff094b49","Type":"ContainerStarted","Data":"20ded96b266c93e97cf0bc127e56d37dedb8595d42ae0c625e781a001927c0ee"}
Apr 22 21:10:03.109811 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.109786 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerStarted","Data":"55254b91c8e68d1e4c3136d314639b76c3570a1b212baea420b9a811b0e250d8"}
Apr 22 21:10:03.122706 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.122640 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-j7nqz" podStartSLOduration=1.5574863749999999 podStartE2EDuration="6.12262547s" podCreationTimestamp="2026-04-22 21:09:57 +0000 UTC" firstStartedPulling="2026-04-22 21:09:57.662240162 +0000 UTC m=+45.535131992" lastFinishedPulling="2026-04-22 21:10:02.227379242 +0000 UTC m=+50.100271087" observedRunningTime="2026-04-22 21:10:03.122017076 +0000 UTC m=+50.994908953" watchObservedRunningTime="2026-04-22 21:10:03.12262547 +0000 UTC m=+50.995517322"
Apr 22 21:10:03.137838 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.137798 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jrgd2" podStartSLOduration=3.107598995 podStartE2EDuration="4.137787136s" podCreationTimestamp="2026-04-22 21:09:59 +0000 UTC" firstStartedPulling="2026-04-22 21:10:00.071999106 +0000 UTC m=+47.944890939" lastFinishedPulling="2026-04-22 21:10:01.102187242 +0000 UTC m=+48.975079080" observedRunningTime="2026-04-22 21:10:03.136976756 +0000 UTC m=+51.009868603" watchObservedRunningTime="2026-04-22 21:10:03.137787136 +0000 UTC m=+51.010678979"
Apr 22 21:10:03.943028 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.942822 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-748ffcd58d-lljq4"]
Apr 22 21:10:03.958582 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.957358 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-748ffcd58d-lljq4"]
Apr 22 21:10:03.958582 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.957518 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:03.959945 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.959898 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 22 21:10:03.960695 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.960674 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-scmsj\""
Apr 22 21:10:03.960695 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.960686 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-20l4edp2rmr69\""
Apr 22 21:10:03.960857 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.960787 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 22 21:10:03.960857 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.960674 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 21:10:03.961043 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:03.961022 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 22 21:10:04.040018 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.039988 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/43473ce1-d3e6-424f-aef0-e230cda76179-secret-metrics-server-tls\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.040214 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.040032 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43473ce1-d3e6-424f-aef0-e230cda76179-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.040214 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.040067 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/43473ce1-d3e6-424f-aef0-e230cda76179-metrics-server-audit-profiles\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.040214 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.040131 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43473ce1-d3e6-424f-aef0-e230cda76179-client-ca-bundle\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.040214 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.040184 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7vbm\" (UniqueName: \"kubernetes.io/projected/43473ce1-d3e6-424f-aef0-e230cda76179-kube-api-access-k7vbm\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.040214 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.040212 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/43473ce1-d3e6-424f-aef0-e230cda76179-secret-metrics-server-client-certs\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.040393 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.040313 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/43473ce1-d3e6-424f-aef0-e230cda76179-audit-log\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.141109 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.141073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/43473ce1-d3e6-424f-aef0-e230cda76179-secret-metrics-server-tls\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.141565 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.141114 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43473ce1-d3e6-424f-aef0-e230cda76179-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.141565 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.141390 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/43473ce1-d3e6-424f-aef0-e230cda76179-metrics-server-audit-profiles\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.141565 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.141510 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43473ce1-d3e6-424f-aef0-e230cda76179-client-ca-bundle\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.141565 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.141535 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k7vbm\" (UniqueName: \"kubernetes.io/projected/43473ce1-d3e6-424f-aef0-e230cda76179-kube-api-access-k7vbm\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.141793 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.141582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/43473ce1-d3e6-424f-aef0-e230cda76179-secret-metrics-server-client-certs\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.141793 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.141679 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/43473ce1-d3e6-424f-aef0-e230cda76179-audit-log\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.142216 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.141917 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43473ce1-d3e6-424f-aef0-e230cda76179-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.142216 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.141983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/43473ce1-d3e6-424f-aef0-e230cda76179-audit-log\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.143263 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.143220 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/43473ce1-d3e6-424f-aef0-e230cda76179-metrics-server-audit-profiles\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.145131 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.144879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/43473ce1-d3e6-424f-aef0-e230cda76179-secret-metrics-server-client-certs\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.145131 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.145087 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/43473ce1-d3e6-424f-aef0-e230cda76179-secret-metrics-server-tls\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.145131 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.145101 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43473ce1-d3e6-424f-aef0-e230cda76179-client-ca-bundle\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.153874 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.153850 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7vbm\" (UniqueName: \"kubernetes.io/projected/43473ce1-d3e6-424f-aef0-e230cda76179-kube-api-access-k7vbm\") pod \"metrics-server-748ffcd58d-lljq4\" (UID: \"43473ce1-d3e6-424f-aef0-e230cda76179\") " pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.269356 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.269315 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4"
Apr 22 21:10:04.438135 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.438110 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq"]
Apr 22 21:10:04.442162 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.441450 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq"
Apr 22 21:10:04.445044 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.443866 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 22 21:10:04.445044 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.444104 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-tc7jz\""
Apr 22 21:10:04.451660 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.449983 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq"]
Apr 22 21:10:04.526978 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.526672 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk"]
Apr 22 21:10:04.531227 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:04.531200 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd925cbe_dac6_4f84_8d08_c8337a350d1d.slice/crio-96f793f7ece7797698efcaf3eb4db0b9ab67352cbf013c0ab9508b0ff4c118bf WatchSource:0}: Error finding container 96f793f7ece7797698efcaf3eb4db0b9ab67352cbf013c0ab9508b0ff4c118bf: Status 404 returned error can't find the container with id 96f793f7ece7797698efcaf3eb4db0b9ab67352cbf013c0ab9508b0ff4c118bf
Apr 22 21:10:04.539747 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.539691 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-748ffcd58d-lljq4"]
Apr 22 21:10:04.545935 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.545901 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/55622d37-e0f0-4be4-b871-5f5e077eff37-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8mrrq\" (UID: \"55622d37-e0f0-4be4-b871-5f5e077eff37\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq"
Apr 22 21:10:04.646566 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.646524 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/55622d37-e0f0-4be4-b871-5f5e077eff37-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8mrrq\" (UID: \"55622d37-e0f0-4be4-b871-5f5e077eff37\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq"
Apr 22 21:10:04.646735 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:04.646687 2569 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 22 21:10:04.646802 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:04.646770 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55622d37-e0f0-4be4-b871-5f5e077eff37-monitoring-plugin-cert podName:55622d37-e0f0-4be4-b871-5f5e077eff37 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:05.146745592 +0000 UTC m=+53.019637425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/55622d37-e0f0-4be4-b871-5f5e077eff37-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-8mrrq" (UID: "55622d37-e0f0-4be4-b871-5f5e077eff37") : secret "monitoring-plugin-cert" not found
Apr 22 21:10:04.822094 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.822012 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-74bbd6b687-tl28x"]
Apr 22 21:10:04.825571 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.825546 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.832164 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.831588 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 22 21:10:04.832164 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.831610 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-wm5rb\""
Apr 22 21:10:04.832164 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.831900 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 22 21:10:04.832164 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.831951 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 22 21:10:04.832164 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.832075 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 22 21:10:04.832164 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.832098 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 22 21:10:04.839167 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.837051 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-74bbd6b687-tl28x"]
Apr 22 21:10:04.839167 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.838310 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 22 21:10:04.848443 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.848421 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-telemeter-client-tls\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.848526 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.848474 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.848577 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.848525 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09610e14-4051-4f28-b405-a73b06e01c3f-serving-certs-ca-bundle\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.848577 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.848543 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-federate-client-tls\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.848682 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.848633 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09610e14-4051-4f28-b405-a73b06e01c3f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.848736 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.848670 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hspsm\" (UniqueName: \"kubernetes.io/projected/09610e14-4051-4f28-b405-a73b06e01c3f-kube-api-access-hspsm\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.848784 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.848733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-secret-telemeter-client\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.848784 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.848777 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09610e14-4051-4f28-b405-a73b06e01c3f-metrics-client-ca\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.949784 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.949763 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-telemeter-client-tls\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.949858 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.949816 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.950025 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.950008 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09610e14-4051-4f28-b405-a73b06e01c3f-serving-certs-ca-bundle\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.950058 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.950040 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-federate-client-tls\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.950104 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.950085 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09610e14-4051-4f28-b405-a73b06e01c3f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.950137 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.950109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hspsm\" (UniqueName: \"kubernetes.io/projected/09610e14-4051-4f28-b405-a73b06e01c3f-kube-api-access-hspsm\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.950212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.950158 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-secret-telemeter-client\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.950212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.950197 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09610e14-4051-4f28-b405-a73b06e01c3f-metrics-client-ca\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.950932 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.950879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09610e14-4051-4f28-b405-a73b06e01c3f-serving-certs-ca-bundle\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x"
Apr 22 21:10:04.951036 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.950941 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/09610e14-4051-4f28-b405-a73b06e01c3f-metrics-client-ca\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID:
\"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" Apr 22 21:10:04.951736 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.951693 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09610e14-4051-4f28-b405-a73b06e01c3f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" Apr 22 21:10:04.952699 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.952670 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" Apr 22 21:10:04.952796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.952780 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-telemeter-client-tls\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" Apr 22 21:10:04.952859 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.952816 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-federate-client-tls\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" Apr 22 21:10:04.953222 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.953203 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/09610e14-4051-4f28-b405-a73b06e01c3f-secret-telemeter-client\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" Apr 22 21:10:04.958095 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:04.958072 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hspsm\" (UniqueName: \"kubernetes.io/projected/09610e14-4051-4f28-b405-a73b06e01c3f-kube-api-access-hspsm\") pod \"telemeter-client-74bbd6b687-tl28x\" (UID: \"09610e14-4051-4f28-b405-a73b06e01c3f\") " pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" Apr 22 21:10:05.118687 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.118607 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7" event={"ID":"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb","Type":"ContainerStarted","Data":"e8b43de94944b8332272faa2a85910198c2ef934fa98d1ab366fb62c08566d85"} Apr 22 21:10:05.118687 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.118644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7" event={"ID":"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb","Type":"ContainerStarted","Data":"12077a51e7d7cb31628ad25c123f54725f45cd6ac4097b64658699e8f9807ade"} Apr 22 21:10:05.118687 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.118656 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7" event={"ID":"8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb","Type":"ContainerStarted","Data":"cb7fe7f2f745e31229fa5342af18081a6457ee6ae1a91a5bb6251b224fff6691"} Apr 22 21:10:05.122660 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.122633 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" event={"ID":"0ae169a5-d6ee-433b-af7c-56170339c262","Type":"ContainerStarted","Data":"ce23fbecfd24d6f324e42ae521be12ef23ab0c4a8d4d48d21389b38a7d686b0c"} Apr 22 21:10:05.123593 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.123573 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4" event={"ID":"43473ce1-d3e6-424f-aef0-e230cda76179","Type":"ContainerStarted","Data":"2b25d7e9cd57c1e6677054398391a496abf39256f960ba20d74b0cb4907b45af"} Apr 22 21:10:05.124498 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.124477 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" event={"ID":"cd925cbe-dac6-4f84-8d08-c8337a350d1d","Type":"ContainerStarted","Data":"96f793f7ece7797698efcaf3eb4db0b9ab67352cbf013c0ab9508b0ff4c118bf"} Apr 22 21:10:05.125726 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.125701 2569 generic.go:358] "Generic (PLEG): container finished" podID="bc3a7000-51c7-4478-b73c-963df00f3606" containerID="df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920" exitCode=0 Apr 22 21:10:05.125822 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.125730 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerDied","Data":"df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920"} Apr 22 21:10:05.127180 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.127127 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8fb9f8f44-5nlb2" event={"ID":"53580084-30b5-4540-b077-e50d91769724","Type":"ContainerStarted","Data":"1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507"} Apr 22 21:10:05.135591 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.135553 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-8ktp7" podStartSLOduration=2.643311814 podStartE2EDuration="6.135539978s" podCreationTimestamp="2026-04-22 21:09:59 +0000 UTC" firstStartedPulling="2026-04-22 21:10:00.869128231 +0000 UTC m=+48.742020066" lastFinishedPulling="2026-04-22 21:10:04.361356393 +0000 UTC m=+52.234248230" observedRunningTime="2026-04-22 21:10:05.134452572 +0000 UTC m=+53.007344433" watchObservedRunningTime="2026-04-22 21:10:05.135539978 +0000 UTC m=+53.008431879" Apr 22 21:10:05.144667 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.144647 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" Apr 22 21:10:05.150051 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.149999 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8fb9f8f44-5nlb2" podStartSLOduration=1.338422517 podStartE2EDuration="5.149983279s" podCreationTimestamp="2026-04-22 21:10:00 +0000 UTC" firstStartedPulling="2026-04-22 21:10:00.547542278 +0000 UTC m=+48.420434115" lastFinishedPulling="2026-04-22 21:10:04.359103041 +0000 UTC m=+52.231994877" observedRunningTime="2026-04-22 21:10:05.148323002 +0000 UTC m=+53.021214856" watchObservedRunningTime="2026-04-22 21:10:05.149983279 +0000 UTC m=+53.022875133" Apr 22 21:10:05.155073 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.154780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/55622d37-e0f0-4be4-b871-5f5e077eff37-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8mrrq\" (UID: \"55622d37-e0f0-4be4-b871-5f5e077eff37\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq" Apr 22 21:10:05.157796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.157771 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/55622d37-e0f0-4be4-b871-5f5e077eff37-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8mrrq\" (UID: \"55622d37-e0f0-4be4-b871-5f5e077eff37\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq" Apr 22 21:10:05.189050 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.188982 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7zzq2" podStartSLOduration=2.99758251 podStartE2EDuration="6.188965655s" podCreationTimestamp="2026-04-22 21:09:59 +0000 UTC" firstStartedPulling="2026-04-22 21:10:01.169976776 +0000 UTC m=+49.042868609" lastFinishedPulling="2026-04-22 21:10:04.361359907 +0000 UTC m=+52.234251754" observedRunningTime="2026-04-22 21:10:05.187585067 +0000 UTC m=+53.060476920" watchObservedRunningTime="2026-04-22 21:10:05.188965655 +0000 UTC m=+53.061857507" Apr 22 21:10:05.287023 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.286987 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-74bbd6b687-tl28x"] Apr 22 21:10:05.291730 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:05.291691 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09610e14_4051_4f28_b405_a73b06e01c3f.slice/crio-1557d875b6cf8147a01826c2fc7da958e6416ba6989074177d722378df68bece WatchSource:0}: Error finding container 1557d875b6cf8147a01826c2fc7da958e6416ba6989074177d722378df68bece: Status 404 returned error can't find the container with id 1557d875b6cf8147a01826c2fc7da958e6416ba6989074177d722378df68bece Apr 22 21:10:05.364380 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.364345 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq" Apr 22 21:10:05.508037 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.507967 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq"] Apr 22 21:10:05.512265 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:05.512231 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55622d37_e0f0_4be4_b871_5f5e077eff37.slice/crio-730b2a7c9e5104031cbba2d55195510aee9bf6cdf9d3f8b21979233fe1466e00 WatchSource:0}: Error finding container 730b2a7c9e5104031cbba2d55195510aee9bf6cdf9d3f8b21979233fe1466e00: Status 404 returned error can't find the container with id 730b2a7c9e5104031cbba2d55195510aee9bf6cdf9d3f8b21979233fe1466e00 Apr 22 21:10:05.797766 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.797696 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 21:10:05.802894 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.802864 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.805540 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.805515 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 21:10:05.805677 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.805583 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-ktw22\"" Apr 22 21:10:05.805783 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.805522 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 21:10:05.806025 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.805897 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 21:10:05.808169 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.807764 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 21:10:05.808169 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.807939 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 21:10:05.808169 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.808002 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1bkvgoads58m8\"" Apr 22 21:10:05.810321 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.808952 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 21:10:05.810321 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.809072 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 21:10:05.810321 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.809805 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 21:10:05.810321 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.809981 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 21:10:05.810321 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.810124 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 21:10:05.813940 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.810941 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 21:10:05.813940 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.813180 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 21:10:05.819749 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.819710 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 21:10:05.868118 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.867990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868310 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868121 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868310 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868208 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868310 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868237 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868310 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868267 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868310 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868289 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868563 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868324 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxgg\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-kube-api-access-8cxgg\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868563 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868353 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868563 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868378 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868563 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868413 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868563 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868512 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-web-config\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868794 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868567 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-config-out\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868794 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868794 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868634 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868794 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868794 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868683 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-config\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868794 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868707 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.868794 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.868769 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970261 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970218 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970449 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-web-config\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970449 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970309 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-config-out\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970449 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970354 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970449 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970381 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970449 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970405 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970449 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970430 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-config\") pod \"prometheus-k8s-0\" (UID: 
\"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970454 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970486 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970577 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970631 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" 
(UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970662 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970690 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970713 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970749 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxgg\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-kube-api-access-8cxgg\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970774 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.970800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.970798 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.971623 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.971591 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.974210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.973388 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.974210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.973415 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.974745 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:05.974391 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.976177 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.976105 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.981234 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.980640 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.981234 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.980869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.981234 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.981098 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-config-out\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.981234 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:05.981190 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.981502 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.981318 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.981963 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.981923 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-config\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.982452 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.982401 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.982562 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.982491 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.982629 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:05.982576 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-web-config\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.982770 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.982728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.983019 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.982977 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.983600 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.983582 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:05.985112 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:05.985069 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxgg\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-kube-api-access-8cxgg\") pod \"prometheus-k8s-0\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:06.118994 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:06.118909 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:06.131266 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:06.131237 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq" event={"ID":"55622d37-e0f0-4be4-b871-5f5e077eff37","Type":"ContainerStarted","Data":"730b2a7c9e5104031cbba2d55195510aee9bf6cdf9d3f8b21979233fe1466e00"} Apr 22 21:10:06.132462 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:06.132431 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" event={"ID":"09610e14-4051-4f28-b405-a73b06e01c3f","Type":"ContainerStarted","Data":"1557d875b6cf8147a01826c2fc7da958e6416ba6989074177d722378df68bece"} Apr 22 21:10:08.228241 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:08.227178 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 21:10:08.240654 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:08.239004 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa12663f_80b2_43a3_b968_76c97dac6965.slice/crio-78168a1a63d1b46800bff9aa7b18eea2766fe47df5cae3715bfcdd42861ae0fb WatchSource:0}: Error finding container 78168a1a63d1b46800bff9aa7b18eea2766fe47df5cae3715bfcdd42861ae0fb: Status 404 returned error can't find the container with id 78168a1a63d1b46800bff9aa7b18eea2766fe47df5cae3715bfcdd42861ae0fb Apr 22 21:10:09.161574 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.161545 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4" 
event={"ID":"43473ce1-d3e6-424f-aef0-e230cda76179","Type":"ContainerStarted","Data":"50e4da5b9db33ea95bd82d7f974f7b00e47f2e8b269e1ab4b0d276344f94b6a3"} Apr 22 21:10:09.163512 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.163484 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" event={"ID":"cd925cbe-dac6-4f84-8d08-c8337a350d1d","Type":"ContainerStarted","Data":"4492a31a1e9e2b16d60a081fa8cfbb948dd0d9df18a2b00e1d57e40f8ad87bc9"} Apr 22 21:10:09.163619 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.163521 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" event={"ID":"cd925cbe-dac6-4f84-8d08-c8337a350d1d","Type":"ContainerStarted","Data":"8c96bfc4c8e7abf7391617f6b0e6c869e64430eb69b2b8061859af541d61349f"} Apr 22 21:10:09.163619 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.163533 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" event={"ID":"cd925cbe-dac6-4f84-8d08-c8337a350d1d","Type":"ContainerStarted","Data":"debba2b2c10b3e2d5f175966cc688d836eede91db4b6b81fec205fbcbb5ebe87"} Apr 22 21:10:09.164887 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.164865 2569 generic.go:358] "Generic (PLEG): container finished" podID="fa12663f-80b2-43a3-b968-76c97dac6965" containerID="d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5" exitCode=0 Apr 22 21:10:09.165009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.164933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerDied","Data":"d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5"} Apr 22 21:10:09.165009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.164952 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerStarted","Data":"78168a1a63d1b46800bff9aa7b18eea2766fe47df5cae3715bfcdd42861ae0fb"} Apr 22 21:10:09.170319 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.170296 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerStarted","Data":"f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e"} Apr 22 21:10:09.170401 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.170322 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerStarted","Data":"396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c"} Apr 22 21:10:09.170401 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.170331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerStarted","Data":"4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c"} Apr 22 21:10:09.170401 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.170340 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerStarted","Data":"976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c"} Apr 22 21:10:09.170401 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.170348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerStarted","Data":"be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563"} Apr 22 21:10:09.171811 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.171786 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq" event={"ID":"55622d37-e0f0-4be4-b871-5f5e077eff37","Type":"ContainerStarted","Data":"65df238a2c1457b210afcc093611057a9141ef1467b72a68278f6d82142361b0"} Apr 22 21:10:09.172011 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.171993 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq" Apr 22 21:10:09.173718 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.173694 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" event={"ID":"09610e14-4051-4f28-b405-a73b06e01c3f","Type":"ContainerStarted","Data":"b62868d3129aff9f7a9c6faf85e86966c95aaa94c6d4a015b7eb3708e3b1a4c5"} Apr 22 21:10:09.173816 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.173723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" event={"ID":"09610e14-4051-4f28-b405-a73b06e01c3f","Type":"ContainerStarted","Data":"bca95d5543da7286469ec8f735e4a590ab7991fef0d3d70ebdb267fe5faea6ab"} Apr 22 21:10:09.173816 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.173735 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" event={"ID":"09610e14-4051-4f28-b405-a73b06e01c3f","Type":"ContainerStarted","Data":"f7ca441b75d156147fbb07bca171a3197922e18eeda1783dc779756274809ef4"} Apr 22 21:10:09.177716 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.177687 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq" Apr 22 21:10:09.177891 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.177858 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4" podStartSLOduration=2.954718269 podStartE2EDuration="6.177846801s" 
podCreationTimestamp="2026-04-22 21:10:03 +0000 UTC" firstStartedPulling="2026-04-22 21:10:04.837866009 +0000 UTC m=+52.710757839" lastFinishedPulling="2026-04-22 21:10:08.060994534 +0000 UTC m=+55.933886371" observedRunningTime="2026-04-22 21:10:09.176036699 +0000 UTC m=+57.048928552" watchObservedRunningTime="2026-04-22 21:10:09.177846801 +0000 UTC m=+57.050738653" Apr 22 21:10:09.193680 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.193630 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-74bbd6b687-tl28x" podStartSLOduration=2.426934878 podStartE2EDuration="5.193615864s" podCreationTimestamp="2026-04-22 21:10:04 +0000 UTC" firstStartedPulling="2026-04-22 21:10:05.294314727 +0000 UTC m=+53.167206569" lastFinishedPulling="2026-04-22 21:10:08.060995701 +0000 UTC m=+55.933887555" observedRunningTime="2026-04-22 21:10:09.192923989 +0000 UTC m=+57.065815836" watchObservedRunningTime="2026-04-22 21:10:09.193615864 +0000 UTC m=+57.066507709" Apr 22 21:10:09.210813 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:09.210763 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8mrrq" podStartSLOduration=2.611021171 podStartE2EDuration="5.210747197s" podCreationTimestamp="2026-04-22 21:10:04 +0000 UTC" firstStartedPulling="2026-04-22 21:10:05.514682474 +0000 UTC m=+53.387574311" lastFinishedPulling="2026-04-22 21:10:08.114408488 +0000 UTC m=+55.987300337" observedRunningTime="2026-04-22 21:10:09.209390151 +0000 UTC m=+57.082282003" watchObservedRunningTime="2026-04-22 21:10:09.210747197 +0000 UTC m=+57.083639048" Apr 22 21:10:10.179984 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.179943 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" 
event={"ID":"cd925cbe-dac6-4f84-8d08-c8337a350d1d","Type":"ContainerStarted","Data":"a6c203135ec4605c32b6476878278afa8efe702c1f69c8725f9027db7583cf38"} Apr 22 21:10:10.179984 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.179986 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" event={"ID":"cd925cbe-dac6-4f84-8d08-c8337a350d1d","Type":"ContainerStarted","Data":"3de736564e7631485420c23c5a33dc5ad19a8273d3b823b55a057d385cedfa4d"} Apr 22 21:10:10.180533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.179998 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" event={"ID":"cd925cbe-dac6-4f84-8d08-c8337a350d1d","Type":"ContainerStarted","Data":"62fdc10bdf6c33c56dad9d4b217e3165a3626ea76b64e4088affd2eb2de39c6e"} Apr 22 21:10:10.180533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.180127 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" Apr 22 21:10:10.183541 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.183513 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerStarted","Data":"7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61"} Apr 22 21:10:10.199048 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.199009 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" podStartSLOduration=3.575388389 podStartE2EDuration="8.198996869s" podCreationTimestamp="2026-04-22 21:10:02 +0000 UTC" firstStartedPulling="2026-04-22 21:10:04.533424191 +0000 UTC m=+52.406316025" lastFinishedPulling="2026-04-22 21:10:09.157032661 +0000 UTC m=+57.029924505" observedRunningTime="2026-04-22 21:10:10.197372108 +0000 UTC m=+58.070263961" 
watchObservedRunningTime="2026-04-22 21:10:10.198996869 +0000 UTC m=+58.071888724" Apr 22 21:10:10.220739 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.220698 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.2801429349999998 podStartE2EDuration="10.220685014s" podCreationTimestamp="2026-04-22 21:10:00 +0000 UTC" firstStartedPulling="2026-04-22 21:10:02.221702542 +0000 UTC m=+50.094594375" lastFinishedPulling="2026-04-22 21:10:09.162244621 +0000 UTC m=+57.035136454" observedRunningTime="2026-04-22 21:10:10.21923237 +0000 UTC m=+58.092124213" watchObservedRunningTime="2026-04-22 21:10:10.220685014 +0000 UTC m=+58.093576872" Apr 22 21:10:10.414918 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.414864 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-8fb9f8f44-5nlb2" Apr 22 21:10:10.414918 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.414925 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8fb9f8f44-5nlb2" Apr 22 21:10:10.420829 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:10.420803 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8fb9f8f44-5nlb2" Apr 22 21:10:11.102788 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:11.102760 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xdxhq" Apr 22 21:10:11.191069 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:11.191039 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8fb9f8f44-5nlb2" Apr 22 21:10:12.053479 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:12.053459 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sshlp" Apr 22 21:10:12.192367 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:12.192273 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerStarted","Data":"a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47"} Apr 22 21:10:12.192367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:12.192311 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerStarted","Data":"a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b"} Apr 22 21:10:12.192367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:12.192324 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerStarted","Data":"d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237"} Apr 22 21:10:12.192367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:12.192335 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerStarted","Data":"4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b"} Apr 22 21:10:12.192367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:12.192346 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerStarted","Data":"0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409"} Apr 22 21:10:12.192367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:12.192357 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerStarted","Data":"a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5"} Apr 22 21:10:12.217771 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:12.217705 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.726714646 podStartE2EDuration="7.217687458s" podCreationTimestamp="2026-04-22 21:10:05 +0000 UTC" firstStartedPulling="2026-04-22 21:10:09.166255481 +0000 UTC m=+57.039147312" lastFinishedPulling="2026-04-22 21:10:11.65722829 +0000 UTC m=+59.530120124" observedRunningTime="2026-04-22 21:10:12.216451223 +0000 UTC m=+60.089343111" watchObservedRunningTime="2026-04-22 21:10:12.217687458 +0000 UTC m=+60.090579312" Apr 22 21:10:16.119637 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.119597 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:16.192050 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.192018 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6bd6dd8cc8-pshnk" Apr 22 21:10:16.854928 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.854895 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b6c46488-hvgp7"] Apr 22 21:10:16.892477 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.892446 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b6c46488-hvgp7"] Apr 22 21:10:16.892634 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.892524 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:16.899433 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.899409 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 21:10:16.979332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.979299 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-oauth-serving-cert\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:16.979332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.979334 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-trusted-ca-bundle\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:16.979544 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.979355 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-service-ca\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:16.979544 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.979383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-oauth-config\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 
21:10:16.979544 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.979418 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np27f\" (UniqueName: \"kubernetes.io/projected/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-kube-api-access-np27f\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:16.979544 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.979443 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-config\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:16.979544 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:16.979511 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-serving-cert\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.080163 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.080108 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-serving-cert\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.080351 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.080212 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-oauth-serving-cert\") pod 
\"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.080351 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.080240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-trusted-ca-bundle\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.080351 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.080257 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-service-ca\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.080351 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.080279 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-oauth-config\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.080351 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.080314 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-np27f\" (UniqueName: \"kubernetes.io/projected/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-kube-api-access-np27f\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.080351 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.080352 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-config\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.081030 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.081007 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-oauth-serving-cert\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.081132 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.081083 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-config\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.081223 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.081188 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-service-ca\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.081223 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.081201 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-trusted-ca-bundle\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.082716 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.082694 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-serving-cert\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.082825 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.082728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-oauth-config\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.090901 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.090879 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-np27f\" (UniqueName: \"kubernetes.io/projected/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-kube-api-access-np27f\") pod \"console-b6c46488-hvgp7\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.203174 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.203072 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:17.326292 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.326257 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b6c46488-hvgp7"] Apr 22 21:10:17.329483 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:17.329434 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6f8fd9e_523a_4f00_a44d_15d6e4dbb207.slice/crio-f03a4de923839b341a75b05e8120be109e137ecb6ecb3115c8fcc0f9d07c1ebe WatchSource:0}: Error finding container f03a4de923839b341a75b05e8120be109e137ecb6ecb3115c8fcc0f9d07c1ebe: Status 404 returned error can't find the container with id f03a4de923839b341a75b05e8120be109e137ecb6ecb3115c8fcc0f9d07c1ebe Apr 22 21:10:17.484328 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.484305 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:10:17.486820 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.486800 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 21:10:17.497379 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.497355 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80bac7af-2767-4aee-b3fa-d0683f389b6a-metrics-certs\") pod \"network-metrics-daemon-d7j8j\" (UID: \"80bac7af-2767-4aee-b3fa-d0683f389b6a\") " pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:10:17.585162 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.585111 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:10:17.587194 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.587174 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 21:10:17.597630 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.597612 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 21:10:17.608696 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.608677 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrbsg\" (UniqueName: \"kubernetes.io/projected/d8c1c10e-bf24-4fb2-9019-e759c35b5460-kube-api-access-xrbsg\") pod \"network-check-target-b9rbt\" (UID: \"d8c1c10e-bf24-4fb2-9019-e759c35b5460\") " pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:10:17.699993 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.699966 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-bm97r\"" Apr 22 21:10:17.704485 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.704466 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ccfht\"" Apr 22 21:10:17.708178 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.708164 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:10:17.713782 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.713765 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-d7j8j" Apr 22 21:10:17.843039 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.842974 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b9rbt"] Apr 22 21:10:17.856104 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:17.856077 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8c1c10e_bf24_4fb2_9019_e759c35b5460.slice/crio-3a2e01d42d3d7550e51dd640984f6aa0b93f2f376a4c1569e075f9928c091810 WatchSource:0}: Error finding container 3a2e01d42d3d7550e51dd640984f6aa0b93f2f376a4c1569e075f9928c091810: Status 404 returned error can't find the container with id 3a2e01d42d3d7550e51dd640984f6aa0b93f2f376a4c1569e075f9928c091810 Apr 22 21:10:17.867756 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:17.867735 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-d7j8j"] Apr 22 21:10:17.870366 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:17.870342 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80bac7af_2767_4aee_b3fa_d0683f389b6a.slice/crio-7b8d26d92ae081d6bee9e0288685af740cb8da667e8c305cb39112b7bb89f381 WatchSource:0}: Error finding container 7b8d26d92ae081d6bee9e0288685af740cb8da667e8c305cb39112b7bb89f381: Status 404 returned error can't find the container with id 7b8d26d92ae081d6bee9e0288685af740cb8da667e8c305cb39112b7bb89f381 Apr 22 21:10:18.211778 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:18.211696 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6c46488-hvgp7" event={"ID":"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207","Type":"ContainerStarted","Data":"33eba23e5cb0df63b9d3149757016719e2141f8bfc0593961cfc6ad019bb3a4f"} Apr 22 21:10:18.211778 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:18.211736 
2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6c46488-hvgp7" event={"ID":"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207","Type":"ContainerStarted","Data":"f03a4de923839b341a75b05e8120be109e137ecb6ecb3115c8fcc0f9d07c1ebe"} Apr 22 21:10:18.212860 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:18.212839 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d7j8j" event={"ID":"80bac7af-2767-4aee-b3fa-d0683f389b6a","Type":"ContainerStarted","Data":"7b8d26d92ae081d6bee9e0288685af740cb8da667e8c305cb39112b7bb89f381"} Apr 22 21:10:18.213826 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:18.213808 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b9rbt" event={"ID":"d8c1c10e-bf24-4fb2-9019-e759c35b5460","Type":"ContainerStarted","Data":"3a2e01d42d3d7550e51dd640984f6aa0b93f2f376a4c1569e075f9928c091810"} Apr 22 21:10:18.227811 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:18.227766 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b6c46488-hvgp7" podStartSLOduration=2.227754023 podStartE2EDuration="2.227754023s" podCreationTimestamp="2026-04-22 21:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:10:18.226187356 +0000 UTC m=+66.099079209" watchObservedRunningTime="2026-04-22 21:10:18.227754023 +0000 UTC m=+66.100645874" Apr 22 21:10:20.222540 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:20.222496 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-d7j8j" event={"ID":"80bac7af-2767-4aee-b3fa-d0683f389b6a","Type":"ContainerStarted","Data":"8f88e1eaa952808a035dbffb177eb8740a73947b74e580d3afee7c6b13320d78"} Apr 22 21:10:21.227380 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:21.227339 2569 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/network-metrics-daemon-d7j8j" event={"ID":"80bac7af-2767-4aee-b3fa-d0683f389b6a","Type":"ContainerStarted","Data":"17603bc45cd9705d645d3cb360cf9a38f008f550f4555d92721f1b391e9a3aec"} Apr 22 21:10:21.228601 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:21.228581 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b9rbt" event={"ID":"d8c1c10e-bf24-4fb2-9019-e759c35b5460","Type":"ContainerStarted","Data":"c3c895167609bcb3e49fcb53b720d66d9d941681597d4f5de2225ccabb2b53e6"} Apr 22 21:10:21.228732 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:21.228721 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:10:21.244402 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:21.244354 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-d7j8j" podStartSLOduration=67.560880874 podStartE2EDuration="1m9.24433981s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="2026-04-22 21:10:17.886877843 +0000 UTC m=+65.759769680" lastFinishedPulling="2026-04-22 21:10:19.570336779 +0000 UTC m=+67.443228616" observedRunningTime="2026-04-22 21:10:21.242799811 +0000 UTC m=+69.115691676" watchObservedRunningTime="2026-04-22 21:10:21.24433981 +0000 UTC m=+69.117231663" Apr 22 21:10:21.256471 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:21.256427 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b9rbt" podStartSLOduration=66.325407599 podStartE2EDuration="1m9.256415345s" podCreationTimestamp="2026-04-22 21:09:12 +0000 UTC" firstStartedPulling="2026-04-22 21:10:17.858690919 +0000 UTC m=+65.731582748" lastFinishedPulling="2026-04-22 21:10:20.789698645 +0000 UTC m=+68.662590494" observedRunningTime="2026-04-22 21:10:21.255356079 +0000 UTC 
m=+69.128247930" watchObservedRunningTime="2026-04-22 21:10:21.256415345 +0000 UTC m=+69.129307196" Apr 22 21:10:21.262952 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:21.262928 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:21.282014 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:21.281990 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:22.246723 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:22.246698 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:24.270322 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:24.270289 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4" Apr 22 21:10:24.270696 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:24.270369 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4" Apr 22 21:10:27.203528 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:27.203484 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:27.203528 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:27.203530 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:27.208433 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:27.208412 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:27.252363 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:27.252340 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:10:27.291776 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:27.291748 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8fb9f8f44-5nlb2"] Apr 22 21:10:44.275416 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:44.275387 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4" Apr 22 21:10:44.279197 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:44.279173 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-748ffcd58d-lljq4" Apr 22 21:10:49.252556 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:49.252526 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:10:49.253027 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:49.252982 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="alertmanager" containerID="cri-o://be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563" gracePeriod=120 Apr 22 21:10:49.253096 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:49.253056 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy-web" containerID="cri-o://4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c" gracePeriod=120 Apr 22 21:10:49.253185 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:49.253080 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy" containerID="cri-o://396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c" gracePeriod=120 Apr 22 21:10:49.253185 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:49.253047 2569 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy-metric" containerID="cri-o://f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e" gracePeriod=120 Apr 22 21:10:49.253285 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:49.253173 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="config-reloader" containerID="cri-o://976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c" gracePeriod=120 Apr 22 21:10:49.253285 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:49.253212 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="prom-label-proxy" containerID="cri-o://7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61" gracePeriod=120 Apr 22 21:10:50.330422 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.330390 2569 generic.go:358] "Generic (PLEG): container finished" podID="bc3a7000-51c7-4478-b73c-963df00f3606" containerID="7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61" exitCode=0 Apr 22 21:10:50.330422 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.330417 2569 generic.go:358] "Generic (PLEG): container finished" podID="bc3a7000-51c7-4478-b73c-963df00f3606" containerID="396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c" exitCode=0 Apr 22 21:10:50.330422 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.330424 2569 generic.go:358] "Generic (PLEG): container finished" podID="bc3a7000-51c7-4478-b73c-963df00f3606" containerID="976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c" exitCode=0 Apr 22 21:10:50.330422 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.330430 2569 generic.go:358] 
"Generic (PLEG): container finished" podID="bc3a7000-51c7-4478-b73c-963df00f3606" containerID="be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563" exitCode=0 Apr 22 21:10:50.330921 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.330454 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerDied","Data":"7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61"} Apr 22 21:10:50.330921 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.330476 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerDied","Data":"396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c"} Apr 22 21:10:50.330921 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.330486 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerDied","Data":"976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c"} Apr 22 21:10:50.330921 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.330495 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerDied","Data":"be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563"} Apr 22 21:10:50.483443 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.483421 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:50.570638 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.570610 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmcrf\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-kube-api-access-xmcrf\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.570776 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.570650 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-metrics-client-ca\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.570776 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.570695 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.570776 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.570729 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-web-config\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.570776 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.570752 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-tls-assets\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.571004 
ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.570793 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-main-tls\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.571004 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.570819 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-web\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.571004 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.570863 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-main-db\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.571204 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.571175 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:50.571272 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.571205 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-config-out\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.571272 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.571259 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-trusted-ca-bundle\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.571372 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.571302 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-metric\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.571423 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.571381 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-cluster-tls-config\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 21:10:50.571474 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.571420 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-config-volume\") pod \"bc3a7000-51c7-4478-b73c-963df00f3606\" (UID: \"bc3a7000-51c7-4478-b73c-963df00f3606\") " Apr 22 
21:10:50.571545 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.571521 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:10:50.571737 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.571718 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-main-db\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.571796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.571743 2569 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-metrics-client-ca\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.573543 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.573515 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:50.573871 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.573843 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). 
InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:50.573955 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.573933 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-kube-api-access-xmcrf" (OuterVolumeSpecName: "kube-api-access-xmcrf") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "kube-api-access-xmcrf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:50.574506 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.574482 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:50.574612 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.574524 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:50.574612 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.574595 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:50.574810 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.574775 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:50.575408 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.575385 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-config-out" (OuterVolumeSpecName: "config-out") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:10:50.575672 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.575655 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-config-volume" (OuterVolumeSpecName: "config-volume") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:50.578074 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.578000 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:50.584224 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.584176 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-web-config" (OuterVolumeSpecName: "web-config") pod "bc3a7000-51c7-4478-b73c-963df00f3606" (UID: "bc3a7000-51c7-4478-b73c-963df00f3606"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:50.672279 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672247 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc3a7000-51c7-4478-b73c-963df00f3606-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.672279 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672278 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.672279 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672289 2569 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-cluster-tls-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 
21:10:50.672442 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672298 2569 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-config-volume\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.672442 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672307 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xmcrf\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-kube-api-access-xmcrf\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.672442 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672316 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.672442 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672325 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-web-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.672442 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672333 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bc3a7000-51c7-4478-b73c-963df00f3606-tls-assets\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.672442 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672341 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-main-tls\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.672442 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672349 2569 
reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bc3a7000-51c7-4478-b73c-963df00f3606-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:50.672442 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:50.672358 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bc3a7000-51c7-4478-b73c-963df00f3606-config-out\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:51.336665 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.336637 2569 generic.go:358] "Generic (PLEG): container finished" podID="bc3a7000-51c7-4478-b73c-963df00f3606" containerID="f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e" exitCode=0 Apr 22 21:10:51.336665 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.336659 2569 generic.go:358] "Generic (PLEG): container finished" podID="bc3a7000-51c7-4478-b73c-963df00f3606" containerID="4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c" exitCode=0 Apr 22 21:10:51.337117 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.336679 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerDied","Data":"f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e"} Apr 22 21:10:51.337117 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.336701 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerDied","Data":"4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c"} Apr 22 21:10:51.337117 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.336711 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bc3a7000-51c7-4478-b73c-963df00f3606","Type":"ContainerDied","Data":"55254b91c8e68d1e4c3136d314639b76c3570a1b212baea420b9a811b0e250d8"} Apr 22 21:10:51.337117 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.336726 2569 scope.go:117] "RemoveContainer" containerID="7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61" Apr 22 21:10:51.337117 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.336751 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.346695 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.346675 2569 scope.go:117] "RemoveContainer" containerID="f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e" Apr 22 21:10:51.355670 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.355653 2569 scope.go:117] "RemoveContainer" containerID="396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c" Apr 22 21:10:51.358773 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.358751 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:10:51.362987 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.362630 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:10:51.364995 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.364577 2569 scope.go:117] "RemoveContainer" containerID="4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c" Apr 22 21:10:51.371170 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.371136 2569 scope.go:117] "RemoveContainer" containerID="976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c" Apr 22 21:10:51.377563 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.377546 2569 scope.go:117] "RemoveContainer" containerID="be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563" Apr 22 21:10:51.384313 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.384282 2569 
scope.go:117] "RemoveContainer" containerID="df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920" Apr 22 21:10:51.384682 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.384662 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385090 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385105 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385114 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="prom-label-proxy" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385119 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="prom-label-proxy" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385139 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy-metric" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385162 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy-metric" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385170 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="alertmanager" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385175 2569 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="alertmanager" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385181 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy-web" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385186 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy-web" Apr 22 21:10:51.385189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385194 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="config-reloader" Apr 22 21:10:51.385594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385199 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="config-reloader" Apr 22 21:10:51.385594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385206 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="init-config-reloader" Apr 22 21:10:51.385594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385211 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="init-config-reloader" Apr 22 21:10:51.385594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385253 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy-metric" Apr 22 21:10:51.385594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385261 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy-web" Apr 22 21:10:51.385594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385272 2569 
memory_manager.go:356] "RemoveStaleState removing state" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="kube-rbac-proxy" Apr 22 21:10:51.385594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385278 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="config-reloader" Apr 22 21:10:51.385594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385284 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="alertmanager" Apr 22 21:10:51.385594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.385290 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" containerName="prom-label-proxy" Apr 22 21:10:51.391555 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.391533 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.393747 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.393714 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 21:10:51.393860 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.393796 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 21:10:51.393860 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.393801 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 21:10:51.393860 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.393841 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 21:10:51.394069 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.394050 2569 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 21:10:51.394231 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.394212 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 21:10:51.394320 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.394226 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xb2qx\"" Apr 22 21:10:51.394320 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.394231 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 21:10:51.394320 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.394212 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 21:10:51.395084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.395064 2569 scope.go:117] "RemoveContainer" containerID="7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61" Apr 22 21:10:51.395388 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:51.395361 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61\": container with ID starting with 7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61 not found: ID does not exist" containerID="7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61" Apr 22 21:10:51.395472 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.395399 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61"} err="failed to get container status 
\"7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61\": rpc error: code = NotFound desc = could not find container \"7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61\": container with ID starting with 7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61 not found: ID does not exist" Apr 22 21:10:51.395472 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.395451 2569 scope.go:117] "RemoveContainer" containerID="f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e" Apr 22 21:10:51.395718 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:51.395681 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e\": container with ID starting with f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e not found: ID does not exist" containerID="f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e" Apr 22 21:10:51.395767 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.395724 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e"} err="failed to get container status \"f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e\": rpc error: code = NotFound desc = could not find container \"f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e\": container with ID starting with f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e not found: ID does not exist" Apr 22 21:10:51.395767 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.395750 2569 scope.go:117] "RemoveContainer" containerID="396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c" Apr 22 21:10:51.395999 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:51.395977 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c\": container with ID starting with 396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c not found: ID does not exist" containerID="396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c" Apr 22 21:10:51.396067 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.396009 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c"} err="failed to get container status \"396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c\": rpc error: code = NotFound desc = could not find container \"396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c\": container with ID starting with 396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c not found: ID does not exist" Apr 22 21:10:51.396067 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.396025 2569 scope.go:117] "RemoveContainer" containerID="4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c" Apr 22 21:10:51.396319 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:51.396302 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c\": container with ID starting with 4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c not found: ID does not exist" containerID="4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c" Apr 22 21:10:51.396382 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.396322 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c"} err="failed to get container status \"4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c\": rpc error: code = NotFound desc = could not find 
container \"4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c\": container with ID starting with 4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c not found: ID does not exist" Apr 22 21:10:51.396382 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.396341 2569 scope.go:117] "RemoveContainer" containerID="976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c" Apr 22 21:10:51.396672 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:51.396647 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c\": container with ID starting with 976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c not found: ID does not exist" containerID="976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c" Apr 22 21:10:51.396791 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.396677 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c"} err="failed to get container status \"976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c\": rpc error: code = NotFound desc = could not find container \"976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c\": container with ID starting with 976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c not found: ID does not exist" Apr 22 21:10:51.396791 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.396697 2569 scope.go:117] "RemoveContainer" containerID="be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563" Apr 22 21:10:51.397115 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:51.397006 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563\": container with ID starting 
with be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563 not found: ID does not exist" containerID="be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563" Apr 22 21:10:51.397115 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.397026 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563"} err="failed to get container status \"be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563\": rpc error: code = NotFound desc = could not find container \"be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563\": container with ID starting with be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563 not found: ID does not exist" Apr 22 21:10:51.397115 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.397046 2569 scope.go:117] "RemoveContainer" containerID="df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920" Apr 22 21:10:51.398090 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:51.398008 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920\": container with ID starting with df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920 not found: ID does not exist" containerID="df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920" Apr 22 21:10:51.398090 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.398049 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920"} err="failed to get container status \"df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920\": rpc error: code = NotFound desc = could not find container \"df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920\": container with ID starting with 
df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920 not found: ID does not exist" Apr 22 21:10:51.398090 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.398073 2569 scope.go:117] "RemoveContainer" containerID="7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61" Apr 22 21:10:51.399241 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.399214 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61"} err="failed to get container status \"7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61\": rpc error: code = NotFound desc = could not find container \"7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61\": container with ID starting with 7c3910ab4ed1c953fb0448b52092083a1c48b3aeef35e81506a216856f561a61 not found: ID does not exist" Apr 22 21:10:51.399241 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.399241 2569 scope.go:117] "RemoveContainer" containerID="f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e" Apr 22 21:10:51.400058 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.400027 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 21:10:51.400482 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.400456 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e"} err="failed to get container status \"f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e\": rpc error: code = NotFound desc = could not find container \"f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e\": container with ID starting with f53dcb1d0411768aae8d06a9ec19444e82cb00d28d8d1dcad6808d483e7b667e not found: ID does not exist" Apr 22 21:10:51.400482 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:10:51.400483 2569 scope.go:117] "RemoveContainer" containerID="396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c" Apr 22 21:10:51.400750 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.400731 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c"} err="failed to get container status \"396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c\": rpc error: code = NotFound desc = could not find container \"396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c\": container with ID starting with 396884ce11c5c05c8a4dcaa4b6aa7f61470672e8e65ac4d87f1806fedca5c76c not found: ID does not exist" Apr 22 21:10:51.400814 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.400751 2569 scope.go:117] "RemoveContainer" containerID="4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c" Apr 22 21:10:51.400989 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.400973 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c"} err="failed to get container status \"4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c\": rpc error: code = NotFound desc = could not find container \"4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c\": container with ID starting with 4035caef45bb2822316272745e3b1277d6d92ccc969c50d629dbda1c9eaa1d8c not found: ID does not exist" Apr 22 21:10:51.400989 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.400989 2569 scope.go:117] "RemoveContainer" containerID="976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c" Apr 22 21:10:51.401297 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.401259 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:10:51.401297 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:10:51.401282 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c"} err="failed to get container status \"976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c\": rpc error: code = NotFound desc = could not find container \"976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c\": container with ID starting with 976108a686da0781f7c19e55ddd91d1a20b694fc48f5a9e3419433042f89133c not found: ID does not exist" Apr 22 21:10:51.401455 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.401303 2569 scope.go:117] "RemoveContainer" containerID="be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563" Apr 22 21:10:51.401592 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.401572 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563"} err="failed to get container status \"be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563\": rpc error: code = NotFound desc = could not find container \"be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563\": container with ID starting with be4009c8c8c3014be08707412515df6fe2d2822c8757699eb2ab5c15e39ab563 not found: ID does not exist" Apr 22 21:10:51.401699 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.401596 2569 scope.go:117] "RemoveContainer" containerID="df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920" Apr 22 21:10:51.401894 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.401876 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920"} err="failed to get container status \"df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920\": rpc error: code = NotFound desc = could not find container 
\"df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920\": container with ID starting with df8ff2dd672d57d638eddc3b73530c2d62c74d9db466e299976d1b4e7ae20920 not found: ID does not exist" Apr 22 21:10:51.479116 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479086 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3ceb214d-3079-4c0d-a30f-961fe468d29b-config-out\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479116 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479124 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479347 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479164 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479347 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479181 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nm68\" (UniqueName: \"kubernetes.io/projected/3ceb214d-3079-4c0d-a30f-961fe468d29b-kube-api-access-7nm68\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479347 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:10:51.479252 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479347 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479297 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3ceb214d-3079-4c0d-a30f-961fe468d29b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479347 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479320 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3ceb214d-3079-4c0d-a30f-961fe468d29b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479347 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479337 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ceb214d-3079-4c0d-a30f-961fe468d29b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479524 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479363 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-config-volume\") pod 
\"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479524 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479524 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479431 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-web-config\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479524 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479476 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.479524 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.479495 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3ceb214d-3079-4c0d-a30f-961fe468d29b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.580561 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580513 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3ceb214d-3079-4c0d-a30f-961fe468d29b-config-out\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.580561 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580566 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.580748 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580588 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.580748 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580610 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nm68\" (UniqueName: \"kubernetes.io/projected/3ceb214d-3079-4c0d-a30f-961fe468d29b-kube-api-access-7nm68\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.580748 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 22 21:10:51.580748 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580698 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3ceb214d-3079-4c0d-a30f-961fe468d29b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.580945 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580830 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3ceb214d-3079-4c0d-a30f-961fe468d29b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.580945 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580873 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ceb214d-3079-4c0d-a30f-961fe468d29b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.580945 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580923 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-config-volume\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.581105 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.581105 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.580994 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-web-config\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.581105 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.581051 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.581105 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.581079 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3ceb214d-3079-4c0d-a30f-961fe468d29b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.581551 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.581468 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/3ceb214d-3079-4c0d-a30f-961fe468d29b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.582076 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.581690 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/3ceb214d-3079-4c0d-a30f-961fe468d29b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.583656 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.583549 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3ceb214d-3079-4c0d-a30f-961fe468d29b-config-out\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.584074 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.583836 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.584074 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.583915 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.584074 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.583937 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.584074 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.584038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-web-config\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.584323 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.584038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.584323 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.584274 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.584466 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.584445 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3ceb214d-3079-4c0d-a30f-961fe468d29b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.584522 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.584491 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ceb214d-3079-4c0d-a30f-961fe468d29b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.585730 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.585712 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/3ceb214d-3079-4c0d-a30f-961fe468d29b-config-volume\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.588381 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.588329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nm68\" (UniqueName: \"kubernetes.io/projected/3ceb214d-3079-4c0d-a30f-961fe468d29b-kube-api-access-7nm68\") pod \"alertmanager-main-0\" (UID: \"3ceb214d-3079-4c0d-a30f-961fe468d29b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.703523 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.703487 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 21:10:51.830206 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:51.830167 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 21:10:51.834114 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:51.834081 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ceb214d_3079_4c0d_a30f_961fe468d29b.slice/crio-c7ee08cc3a6c574ff4f6e9d66e38dfaf1eb7ced52da281f6fe4577ded09eb1ea WatchSource:0}: Error finding container c7ee08cc3a6c574ff4f6e9d66e38dfaf1eb7ced52da281f6fe4577ded09eb1ea: Status 404 returned error can't find the container with id c7ee08cc3a6c574ff4f6e9d66e38dfaf1eb7ced52da281f6fe4577ded09eb1ea Apr 22 21:10:52.234498 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.234417 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b9rbt" Apr 22 21:10:52.312581 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.312545 2569 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="openshift-console/console-8fb9f8f44-5nlb2" podUID="53580084-30b5-4540-b077-e50d91769724" containerName="console" containerID="cri-o://1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507" gracePeriod=15 Apr 22 21:10:52.340341 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.340312 2569 generic.go:358] "Generic (PLEG): container finished" podID="3ceb214d-3079-4c0d-a30f-961fe468d29b" containerID="ee36a291ce22bb213c8dd02f21f52b4375c40bf7b197c96af5c5b2b481efaee1" exitCode=0 Apr 22 21:10:52.340650 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.340401 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3ceb214d-3079-4c0d-a30f-961fe468d29b","Type":"ContainerDied","Data":"ee36a291ce22bb213c8dd02f21f52b4375c40bf7b197c96af5c5b2b481efaee1"} Apr 22 21:10:52.340650 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.340432 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3ceb214d-3079-4c0d-a30f-961fe468d29b","Type":"ContainerStarted","Data":"c7ee08cc3a6c574ff4f6e9d66e38dfaf1eb7ced52da281f6fe4577ded09eb1ea"} Apr 22 21:10:52.550433 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.550411 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8fb9f8f44-5nlb2_53580084-30b5-4540-b077-e50d91769724/console/0.log" Apr 22 21:10:52.550519 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.550490 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8fb9f8f44-5nlb2" Apr 22 21:10:52.590001 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.589976 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd95v\" (UniqueName: \"kubernetes.io/projected/53580084-30b5-4540-b077-e50d91769724-kube-api-access-dd95v\") pod \"53580084-30b5-4540-b077-e50d91769724\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " Apr 22 21:10:52.590121 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.590025 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-console-config\") pod \"53580084-30b5-4540-b077-e50d91769724\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " Apr 22 21:10:52.590121 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.590101 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-service-ca\") pod \"53580084-30b5-4540-b077-e50d91769724\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " Apr 22 21:10:52.590344 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.590124 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-oauth-config\") pod \"53580084-30b5-4540-b077-e50d91769724\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " Apr 22 21:10:52.590344 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.590183 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-serving-cert\") pod \"53580084-30b5-4540-b077-e50d91769724\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " Apr 22 21:10:52.590344 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:52.590218 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-oauth-serving-cert\") pod \"53580084-30b5-4540-b077-e50d91769724\" (UID: \"53580084-30b5-4540-b077-e50d91769724\") " Apr 22 21:10:52.590876 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.590681 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "53580084-30b5-4540-b077-e50d91769724" (UID: "53580084-30b5-4540-b077-e50d91769724"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:52.590876 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.590827 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-console-config" (OuterVolumeSpecName: "console-config") pod "53580084-30b5-4540-b077-e50d91769724" (UID: "53580084-30b5-4540-b077-e50d91769724"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:52.590876 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.590839 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-service-ca" (OuterVolumeSpecName: "service-ca") pod "53580084-30b5-4540-b077-e50d91769724" (UID: "53580084-30b5-4540-b077-e50d91769724"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:10:52.592789 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.592767 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53580084-30b5-4540-b077-e50d91769724-kube-api-access-dd95v" (OuterVolumeSpecName: "kube-api-access-dd95v") pod "53580084-30b5-4540-b077-e50d91769724" (UID: "53580084-30b5-4540-b077-e50d91769724"). InnerVolumeSpecName "kube-api-access-dd95v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:10:52.592984 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.592949 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "53580084-30b5-4540-b077-e50d91769724" (UID: "53580084-30b5-4540-b077-e50d91769724"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:52.593490 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.593468 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "53580084-30b5-4540-b077-e50d91769724" (UID: "53580084-30b5-4540-b077-e50d91769724"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:10:52.691450 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.691425 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-service-ca\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:52.691450 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.691449 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-oauth-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:52.691556 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.691461 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/53580084-30b5-4540-b077-e50d91769724-console-serving-cert\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:52.691556 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.691470 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-oauth-serving-cert\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:52.691556 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.691480 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dd95v\" (UniqueName: \"kubernetes.io/projected/53580084-30b5-4540-b077-e50d91769724-kube-api-access-dd95v\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:52.691556 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:52.691488 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/53580084-30b5-4540-b077-e50d91769724-console-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:10:52.789855 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:52.789832 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc3a7000-51c7-4478-b73c-963df00f3606" path="/var/lib/kubelet/pods/bc3a7000-51c7-4478-b73c-963df00f3606/volumes" Apr 22 21:10:53.346780 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.346748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3ceb214d-3079-4c0d-a30f-961fe468d29b","Type":"ContainerStarted","Data":"bf85ba10d0687d2c35373d31b760c5fc2ec8857e9f3d41660d81259bba4eba27"} Apr 22 21:10:53.346780 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.346785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3ceb214d-3079-4c0d-a30f-961fe468d29b","Type":"ContainerStarted","Data":"6ce8b3d5664f9e49390166983fa9d9583c34bdde8ccd493b25c4a80a1a76251c"} Apr 22 21:10:53.347272 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.346795 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3ceb214d-3079-4c0d-a30f-961fe468d29b","Type":"ContainerStarted","Data":"cfcda63b541ae4d724bf845c08514d06e80f9cca380664d4731d9f83a840727b"} Apr 22 21:10:53.347272 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.346807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3ceb214d-3079-4c0d-a30f-961fe468d29b","Type":"ContainerStarted","Data":"3cb112a5dc10649e44380ab498b8436acc0c59b8944417929f360e1f79939a71"} Apr 22 21:10:53.347272 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.346819 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3ceb214d-3079-4c0d-a30f-961fe468d29b","Type":"ContainerStarted","Data":"8ffaf1f33f1b020a07f68116b5f402a997bbad050aed5c369eb75ff11cb01b3e"} Apr 22 21:10:53.347272 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.346832 2569 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"3ceb214d-3079-4c0d-a30f-961fe468d29b","Type":"ContainerStarted","Data":"73ef2e6972fee3da4278603a77bd10319351ff9b843ef2e7de1e42be476e569c"} Apr 22 21:10:53.347805 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.347790 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-8fb9f8f44-5nlb2_53580084-30b5-4540-b077-e50d91769724/console/0.log" Apr 22 21:10:53.347854 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.347823 2569 generic.go:358] "Generic (PLEG): container finished" podID="53580084-30b5-4540-b077-e50d91769724" containerID="1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507" exitCode=2 Apr 22 21:10:53.347884 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.347851 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8fb9f8f44-5nlb2" event={"ID":"53580084-30b5-4540-b077-e50d91769724","Type":"ContainerDied","Data":"1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507"} Apr 22 21:10:53.347884 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.347880 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8fb9f8f44-5nlb2" event={"ID":"53580084-30b5-4540-b077-e50d91769724","Type":"ContainerDied","Data":"220661187a0d56e7140f5eb35885349be8a3e734671a1ff51f8a78ee3952888a"} Apr 22 21:10:53.347944 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.347898 2569 scope.go:117] "RemoveContainer" containerID="1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507" Apr 22 21:10:53.347944 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.347901 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8fb9f8f44-5nlb2" Apr 22 21:10:53.355590 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.355576 2569 scope.go:117] "RemoveContainer" containerID="1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507" Apr 22 21:10:53.355801 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:53.355784 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507\": container with ID starting with 1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507 not found: ID does not exist" containerID="1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507" Apr 22 21:10:53.355844 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.355807 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507"} err="failed to get container status \"1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507\": rpc error: code = NotFound desc = could not find container \"1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507\": container with ID starting with 1a67edb01be928d30e9ce98089ad6aa15ab591a4474cd8f5ffccd657fab19507 not found: ID does not exist" Apr 22 21:10:53.371319 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.371207 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.371196374 podStartE2EDuration="2.371196374s" podCreationTimestamp="2026-04-22 21:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:10:53.369678654 +0000 UTC m=+101.242570518" watchObservedRunningTime="2026-04-22 21:10:53.371196374 +0000 UTC m=+101.244088237" Apr 22 21:10:53.383040 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:53.383017 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-8fb9f8f44-5nlb2"] Apr 22 21:10:53.386963 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.386945 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-8fb9f8f44-5nlb2"] Apr 22 21:10:53.483676 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.483644 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 21:10:53.484094 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.484068 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="prometheus" containerID="cri-o://a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5" gracePeriod=600 Apr 22 21:10:53.484184 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.484094 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy" containerID="cri-o://a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b" gracePeriod=600 Apr 22 21:10:53.484184 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.484118 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="thanos-sidecar" containerID="cri-o://4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b" gracePeriod=600 Apr 22 21:10:53.484184 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.484127 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="config-reloader" 
containerID="cri-o://0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409" gracePeriod=600 Apr 22 21:10:53.484184 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.484174 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy-web" containerID="cri-o://d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237" gracePeriod=600 Apr 22 21:10:53.484396 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:53.484103 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy-thanos" containerID="cri-o://a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47" gracePeriod=600 Apr 22 21:10:54.355033 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355002 2569 generic.go:358] "Generic (PLEG): container finished" podID="fa12663f-80b2-43a3-b968-76c97dac6965" containerID="a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47" exitCode=0 Apr 22 21:10:54.355033 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355026 2569 generic.go:358] "Generic (PLEG): container finished" podID="fa12663f-80b2-43a3-b968-76c97dac6965" containerID="a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b" exitCode=0 Apr 22 21:10:54.355033 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355032 2569 generic.go:358] "Generic (PLEG): container finished" podID="fa12663f-80b2-43a3-b968-76c97dac6965" containerID="4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b" exitCode=0 Apr 22 21:10:54.355033 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355038 2569 generic.go:358] "Generic (PLEG): container finished" podID="fa12663f-80b2-43a3-b968-76c97dac6965" containerID="0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409" exitCode=0 Apr 22 21:10:54.355033 
ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355044 2569 generic.go:358] "Generic (PLEG): container finished" podID="fa12663f-80b2-43a3-b968-76c97dac6965" containerID="a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5" exitCode=0
Apr 22 21:10:54.355526 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355072 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerDied","Data":"a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47"}
Apr 22 21:10:54.355526 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355101 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerDied","Data":"a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b"}
Apr 22 21:10:54.355526 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355111 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerDied","Data":"4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b"}
Apr 22 21:10:54.355526 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355120 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerDied","Data":"0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409"}
Apr 22 21:10:54.355526 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.355128 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerDied","Data":"a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5"}
Apr 22 21:10:54.723162 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.723128 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 21:10:54.789432 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.789402 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53580084-30b5-4540-b077-e50d91769724" path="/var/lib/kubelet/pods/53580084-30b5-4540-b077-e50d91769724/volumes"
Apr 22 21:10:54.808828 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.808806 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-thanos-prometheus-http-client-file\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.808946 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.808838 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.808946 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.808854 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-kube-rbac-proxy\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.808946 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.808872 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-tls-assets\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.808946 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.808895 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-grpc-tls\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.808946 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.808919 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-kubelet-serving-ca-bundle\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809257 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.808975 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809257 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809015 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-db\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809257 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809048 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-metrics-client-ca\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809257 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809096 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-serving-certs-ca-bundle\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809257 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809130 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cxgg\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-kube-api-access-8cxgg\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809257 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809175 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-config-out\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809257 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809200 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-config\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809257 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809259 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-rulefiles-0\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809681 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809284 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-web-config\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809681 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809325 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-tls\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809681 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809352 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-trusted-ca-bundle\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809681 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809378 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-metrics-client-certs\") pod \"fa12663f-80b2-43a3-b968-76c97dac6965\" (UID: \"fa12663f-80b2-43a3-b968-76c97dac6965\") "
Apr 22 21:10:54.809681 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809386 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 21:10:54.809681 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.809635 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.810780 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.810105 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:10:54.810780 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.810443 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 21:10:54.810780 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.810631 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 21:10:54.811116 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.811090 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 21:10:54.812519 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.812497 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:10:54.812630 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.812568 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:10:54.812746 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.812618 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-config" (OuterVolumeSpecName: "config") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:10:54.812863 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.812762 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:10:54.813135 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.813094 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:10:54.813443 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.813420 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-kube-api-access-8cxgg" (OuterVolumeSpecName: "kube-api-access-8cxgg") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "kube-api-access-8cxgg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:10:54.813920 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.813896 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:10:54.814004 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.813966 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:10:54.814205 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.814125 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:10:54.814205 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.814174 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-config-out" (OuterVolumeSpecName: "config-out") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:10:54.814710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.814692 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:10:54.815544 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.815525 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 21:10:54.823970 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.823952 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-web-config" (OuterVolumeSpecName: "web-config") pod "fa12663f-80b2-43a3-b968-76c97dac6965" (UID: "fa12663f-80b2-43a3-b968-76c97dac6965"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 21:10:54.910499 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910450 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-db\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910499 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910470 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-metrics-client-ca\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910499 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910480 2569 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910499 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910489 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8cxgg\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-kube-api-access-8cxgg\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910499 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910498 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fa12663f-80b2-43a3-b968-76c97dac6965-config-out\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910507 2569 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910515 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910524 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-web-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910532 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910540 2569 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa12663f-80b2-43a3-b968-76c97dac6965-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910550 2569 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-metrics-client-certs\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910559 2569 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910568 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910577 2569 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-kube-rbac-proxy\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910586 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fa12663f-80b2-43a3-b968-76c97dac6965-tls-assets\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910595 2569 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-grpc-tls\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:54.910678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:54.910603 2569 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fa12663f-80b2-43a3-b968-76c97dac6965-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:10:55.361557 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.361530 2569 generic.go:358] "Generic (PLEG): container finished" podID="fa12663f-80b2-43a3-b968-76c97dac6965" containerID="d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237" exitCode=0
Apr 22 21:10:55.361959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.361619 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerDied","Data":"d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237"}
Apr 22 21:10:55.361959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.361658 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fa12663f-80b2-43a3-b968-76c97dac6965","Type":"ContainerDied","Data":"78168a1a63d1b46800bff9aa7b18eea2766fe47df5cae3715bfcdd42861ae0fb"}
Apr 22 21:10:55.361959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.361676 2569 scope.go:117] "RemoveContainer" containerID="a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47"
Apr 22 21:10:55.361959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.361629 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 21:10:55.377012 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.376986 2569 scope.go:117] "RemoveContainer" containerID="a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b"
Apr 22 21:10:55.383551 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.383535 2569 scope.go:117] "RemoveContainer" containerID="d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237"
Apr 22 21:10:55.389615 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.389568 2569 scope.go:117] "RemoveContainer" containerID="4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b"
Apr 22 21:10:55.391697 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.391680 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 21:10:55.398695 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.398679 2569 scope.go:117] "RemoveContainer" containerID="0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409"
Apr 22 21:10:55.399019 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.399000 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 21:10:55.404751 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.404736 2569 scope.go:117] "RemoveContainer" containerID="a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5"
Apr 22 21:10:55.411046 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.411031 2569 scope.go:117] "RemoveContainer" containerID="d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5"
Apr 22 21:10:55.417088 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.417072 2569 scope.go:117] "RemoveContainer" containerID="a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47"
Apr 22 21:10:55.417334 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:55.417317 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47\": container with ID starting with a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47 not found: ID does not exist" containerID="a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47"
Apr 22 21:10:55.417388 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.417341 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47"} err="failed to get container status \"a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47\": rpc error: code = NotFound desc = could not find container \"a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47\": container with ID starting with a67daa047c0a284b73a0ac6881dfc296057c9ac3d492193780c9eb2aaa55fe47 not found: ID does not exist"
Apr 22 21:10:55.417388 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.417358 2569 scope.go:117] "RemoveContainer" containerID="a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b"
Apr 22 21:10:55.417561 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:55.417541 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b\": container with ID starting with a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b not found: ID does not exist" containerID="a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b"
Apr 22 21:10:55.417626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.417581 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b"} err="failed to get container status \"a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b\": rpc error: code = NotFound desc = could not find container \"a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b\": container with ID starting with a02ff8acd2e8ba92e7041cacf8413cf536f3cc2ce378d7ab3101cd1447f85c6b not found: ID does not exist"
Apr 22 21:10:55.417626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.417605 2569 scope.go:117] "RemoveContainer" containerID="d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237"
Apr 22 21:10:55.417812 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:55.417796 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237\": container with ID starting with d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237 not found: ID does not exist" containerID="d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237"
Apr 22 21:10:55.417851 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.417815 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237"} err="failed to get container status \"d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237\": rpc error: code = NotFound desc = could not find container \"d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237\": container with ID starting with d16ca88861410afb0b806f3b406d061adb1e2be9f4a7ee308d7b3cab6c5d3237 not found: ID does not exist"
Apr 22 21:10:55.417851 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.417828 2569 scope.go:117] "RemoveContainer" containerID="4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b"
Apr 22 21:10:55.417995 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:55.417979 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b\": container with ID starting with 4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b not found: ID does not exist" containerID="4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b"
Apr 22 21:10:55.418047 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.418014 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b"} err="failed to get container status \"4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b\": rpc error: code = NotFound desc = could not find container \"4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b\": container with ID starting with 4baa9b78626000ebe203e47529a46a56036fe5fb18d3a3bd468321369567d75b not found: ID does not exist"
Apr 22 21:10:55.418047 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.418036 2569 scope.go:117] "RemoveContainer" containerID="0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409"
Apr 22 21:10:55.418274 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:55.418257 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409\": container with ID starting with 0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409 not found: ID does not exist" containerID="0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409"
Apr 22 21:10:55.418325 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.418278 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409"} err="failed to get container status \"0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409\": rpc error: code = NotFound desc = could not find container \"0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409\": container with ID starting with 0459b12646a06bb44fb4b68eac8dadea2bd5c5c2780c71b6be8e0ec55fabd409 not found: ID does not exist"
Apr 22 21:10:55.418325 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.418290 2569 scope.go:117] "RemoveContainer" containerID="a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5"
Apr 22 21:10:55.418479 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:55.418466 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5\": container with ID starting with a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5 not found: ID does not exist" containerID="a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5"
Apr 22 21:10:55.418521 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.418483 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5"} err="failed to get container status \"a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5\": rpc error: code = NotFound desc = could not find container \"a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5\": container with ID starting with a8ae7fe4f43529491a5c17e64d510f2c1e307c4e0b03518e7e26f4e0ecc372d5 not found: ID does not exist"
Apr 22 21:10:55.418521 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.418494 2569 scope.go:117] "RemoveContainer" containerID="d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5"
Apr 22 21:10:55.418670 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:10:55.418645 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5\": container with ID starting with d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5 not found: ID does not exist" containerID="d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5"
Apr 22 21:10:55.418715 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.418677 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5"} err="failed to get container status \"d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5\": rpc error: code = NotFound desc = could not find container \"d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5\": container with ID starting with d463c704b73501561a1e2994b18ab2b534b8c93e013b0615932a247e642efdc5 not found: ID does not exist"
Apr 22 21:10:55.427583 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427561 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 21:10:55.427887 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427871 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="prometheus"
Apr 22 21:10:55.427964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427890 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="prometheus"
Apr 22 21:10:55.427964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427908 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy"
Apr 22 21:10:55.427964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427917 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy"
Apr 22 21:10:55.427964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427929 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy-thanos"
Apr 22 21:10:55.427964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427937 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy-thanos"
Apr 22 21:10:55.427964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427952 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="config-reloader"
Apr 22 21:10:55.427964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427961 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="config-reloader"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427973 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="init-config-reloader"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427981 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="init-config-reloader"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.427995 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53580084-30b5-4540-b077-e50d91769724" containerName="console"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428004 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="53580084-30b5-4540-b077-e50d91769724" containerName="console"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428018 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy-web"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428026 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy-web"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428061 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="thanos-sidecar"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428070 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="thanos-sidecar"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428136 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="thanos-sidecar"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428166 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="53580084-30b5-4540-b077-e50d91769724" containerName="console"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428179 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy-thanos"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428189 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy-web"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428202 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="prometheus"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428214 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" containerName="kube-rbac-proxy"
Apr 22 21:10:55.428358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.428224 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa12663f-80b2-43a3-b968-76c97dac6965"
containerName="config-reloader" Apr 22 21:10:55.434497 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.434476 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.436766 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.436746 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 21:10:55.436881 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.436857 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 21:10:55.436978 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.436893 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 21:10:55.437034 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.437001 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 21:10:55.437097 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.437074 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 21:10:55.437332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.437310 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 21:10:55.437452 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.437426 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 21:10:55.437538 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.437464 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 21:10:55.437970 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:55.437954 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1bkvgoads58m8\"" Apr 22 21:10:55.438044 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.437999 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-ktw22\"" Apr 22 21:10:55.438117 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.438099 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 21:10:55.438195 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.438173 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 21:10:55.439488 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.439461 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 21:10:55.443230 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.443213 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 21:10:55.444212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.444191 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 21:10:55.515890 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.515865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516068 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.515905 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516068 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.515939 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516068 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.515999 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516068 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516068 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516047 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dvdf\" (UniqueName: \"kubernetes.io/projected/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-kube-api-access-9dvdf\") pod \"prometheus-k8s-0\" (UID: 
\"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516271 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516108 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516271 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516207 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516271 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516230 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516271 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516248 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516272 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516308 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516323 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-web-config\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516356 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-config-out\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516407 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516407 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:10:55.516401 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516590 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516416 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-config\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.516590 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.516447 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.617746 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617646 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dvdf\" (UniqueName: \"kubernetes.io/projected/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-kube-api-access-9dvdf\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.617746 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617701 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.617746 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617741 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617796 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617848 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-web-config\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617908 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-config-out\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617935 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617960 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.617986 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-config\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.618009 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618518 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.618033 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618518 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.618062 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618518 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.618098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618518 ip-10-0-133-75 kubenswrapper[2569]: 
I0422 21:10:55.618171 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618518 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.618201 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.618914 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.618886 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.619070 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.619047 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.620029 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.620005 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.621053 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.620931 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.621525 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.621502 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.621790 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.621769 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.622612 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.622588 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.622725 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.622629 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-tls-assets\") pod \"prometheus-k8s-0\" (UID: 
\"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.622828 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.622739 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-config\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.622934 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.622854 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.623019 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.622876 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.623110 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.622921 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.623396 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.623371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.625030 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.625007 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-config-out\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.625163 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.625123 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-web-config\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.625448 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.625422 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.625775 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.625752 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.625941 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.625923 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dvdf\" (UniqueName: \"kubernetes.io/projected/0e1037e8-6f25-4a2d-b5f0-fb90d011338c-kube-api-access-9dvdf\") pod \"prometheus-k8s-0\" (UID: 
\"0e1037e8-6f25-4a2d-b5f0-fb90d011338c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.745709 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.745675 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:10:55.872609 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:55.872586 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 21:10:55.874447 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:10:55.874420 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1037e8_6f25_4a2d_b5f0_fb90d011338c.slice/crio-b6ba0afa39c89d4774a59827999e0ea11e3015f2c8747ef9f107d15700649f18 WatchSource:0}: Error finding container b6ba0afa39c89d4774a59827999e0ea11e3015f2c8747ef9f107d15700649f18: Status 404 returned error can't find the container with id b6ba0afa39c89d4774a59827999e0ea11e3015f2c8747ef9f107d15700649f18 Apr 22 21:10:56.295057 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:56.295027 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b6c46488-hvgp7"] Apr 22 21:10:56.367198 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:56.367167 2569 generic.go:358] "Generic (PLEG): container finished" podID="0e1037e8-6f25-4a2d-b5f0-fb90d011338c" containerID="f9276bb412e6065dcab5b4973d3831cfd18968e2ccac31f1a571e2d2e646adb5" exitCode=0 Apr 22 21:10:56.367532 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:56.367219 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1037e8-6f25-4a2d-b5f0-fb90d011338c","Type":"ContainerDied","Data":"f9276bb412e6065dcab5b4973d3831cfd18968e2ccac31f1a571e2d2e646adb5"} Apr 22 21:10:56.367532 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:56.367256 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0e1037e8-6f25-4a2d-b5f0-fb90d011338c","Type":"ContainerStarted","Data":"b6ba0afa39c89d4774a59827999e0ea11e3015f2c8747ef9f107d15700649f18"} Apr 22 21:10:56.789771 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:56.789740 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa12663f-80b2-43a3-b968-76c97dac6965" path="/var/lib/kubelet/pods/fa12663f-80b2-43a3-b968-76c97dac6965/volumes" Apr 22 21:10:57.373625 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:57.373589 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1037e8-6f25-4a2d-b5f0-fb90d011338c","Type":"ContainerStarted","Data":"6179d79230404f94d18d90352147e5333fe0fff433fb00b8d6567229e86e4556"} Apr 22 21:10:57.374073 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:57.373629 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1037e8-6f25-4a2d-b5f0-fb90d011338c","Type":"ContainerStarted","Data":"ab568cab3b36e99a7f15efbb0f2cf5b8b7bd70cb07d1ba49bea8853124531473"} Apr 22 21:10:57.374073 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:57.373644 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1037e8-6f25-4a2d-b5f0-fb90d011338c","Type":"ContainerStarted","Data":"dec338e270ba963f25446baef57a54f088faec69ebc5d55568dec88744f9eccc"} Apr 22 21:10:57.374073 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:57.373657 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1037e8-6f25-4a2d-b5f0-fb90d011338c","Type":"ContainerStarted","Data":"268cd59948b06ab288e51c8e820f09e94cb2dd470723254db295b9d4836d54d4"} Apr 22 21:10:57.374073 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:57.373670 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0e1037e8-6f25-4a2d-b5f0-fb90d011338c","Type":"ContainerStarted","Data":"12ff9d14c68816041c201800635355106d80207785406d19aa0683d49f7be53c"} Apr 22 21:10:57.374073 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:57.373681 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0e1037e8-6f25-4a2d-b5f0-fb90d011338c","Type":"ContainerStarted","Data":"3967cdc56c61b6d4f6f3a8b519c5da46295b294ecebacc9de7521c647aa656ea"} Apr 22 21:10:57.399413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:10:57.399369 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.399356035 podStartE2EDuration="2.399356035s" podCreationTimestamp="2026-04-22 21:10:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:10:57.397121638 +0000 UTC m=+105.270013489" watchObservedRunningTime="2026-04-22 21:10:57.399356035 +0000 UTC m=+105.272247886" Apr 22 21:11:00.745952 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:00.745915 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 21:11:16.242963 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.242922 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-szpzh"] Apr 22 21:11:16.246250 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.246227 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.248335 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.248316 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 21:11:16.251759 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.251735 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-szpzh"] Apr 22 21:11:16.390278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.390252 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0be939b4-a64f-442a-957d-d341f369c11b-dbus\") pod \"global-pull-secret-syncer-szpzh\" (UID: \"0be939b4-a64f-442a-957d-d341f369c11b\") " pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.390278 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.390286 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0be939b4-a64f-442a-957d-d341f369c11b-kubelet-config\") pod \"global-pull-secret-syncer-szpzh\" (UID: \"0be939b4-a64f-442a-957d-d341f369c11b\") " pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.390479 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.390305 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0be939b4-a64f-442a-957d-d341f369c11b-original-pull-secret\") pod \"global-pull-secret-syncer-szpzh\" (UID: \"0be939b4-a64f-442a-957d-d341f369c11b\") " pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.490892 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.490861 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/0be939b4-a64f-442a-957d-d341f369c11b-dbus\") pod \"global-pull-secret-syncer-szpzh\" (UID: \"0be939b4-a64f-442a-957d-d341f369c11b\") " pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.491041 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.490899 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0be939b4-a64f-442a-957d-d341f369c11b-kubelet-config\") pod \"global-pull-secret-syncer-szpzh\" (UID: \"0be939b4-a64f-442a-957d-d341f369c11b\") " pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.491041 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.490924 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0be939b4-a64f-442a-957d-d341f369c11b-original-pull-secret\") pod \"global-pull-secret-syncer-szpzh\" (UID: \"0be939b4-a64f-442a-957d-d341f369c11b\") " pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.491041 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.490991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/0be939b4-a64f-442a-957d-d341f369c11b-kubelet-config\") pod \"global-pull-secret-syncer-szpzh\" (UID: \"0be939b4-a64f-442a-957d-d341f369c11b\") " pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.491228 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.491061 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/0be939b4-a64f-442a-957d-d341f369c11b-dbus\") pod \"global-pull-secret-syncer-szpzh\" (UID: \"0be939b4-a64f-442a-957d-d341f369c11b\") " pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.493217 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.493167 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/0be939b4-a64f-442a-957d-d341f369c11b-original-pull-secret\") pod \"global-pull-secret-syncer-szpzh\" (UID: \"0be939b4-a64f-442a-957d-d341f369c11b\") " pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.556282 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.556259 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-szpzh" Apr 22 21:11:16.668838 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:16.668699 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-szpzh"] Apr 22 21:11:16.671431 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:11:16.671406 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0be939b4_a64f_442a_957d_d341f369c11b.slice/crio-d03f86b8bcc33609356538274a15c97107341641b8555309e65fbea855882606 WatchSource:0}: Error finding container d03f86b8bcc33609356538274a15c97107341641b8555309e65fbea855882606: Status 404 returned error can't find the container with id d03f86b8bcc33609356538274a15c97107341641b8555309e65fbea855882606 Apr 22 21:11:17.433033 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:17.432990 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-szpzh" event={"ID":"0be939b4-a64f-442a-957d-d341f369c11b","Type":"ContainerStarted","Data":"d03f86b8bcc33609356538274a15c97107341641b8555309e65fbea855882606"} Apr 22 21:11:21.313550 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.313508 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b6c46488-hvgp7" podUID="d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" containerName="console" containerID="cri-o://33eba23e5cb0df63b9d3149757016719e2141f8bfc0593961cfc6ad019bb3a4f" gracePeriod=15 Apr 22 21:11:21.448015 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:11:21.447994 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b6c46488-hvgp7_d6f8fd9e-523a-4f00-a44d-15d6e4dbb207/console/0.log" Apr 22 21:11:21.448243 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.448029 2569 generic.go:358] "Generic (PLEG): container finished" podID="d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" containerID="33eba23e5cb0df63b9d3149757016719e2141f8bfc0593961cfc6ad019bb3a4f" exitCode=2 Apr 22 21:11:21.448243 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.448109 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6c46488-hvgp7" event={"ID":"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207","Type":"ContainerDied","Data":"33eba23e5cb0df63b9d3149757016719e2141f8bfc0593961cfc6ad019bb3a4f"} Apr 22 21:11:21.449384 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.449360 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-szpzh" event={"ID":"0be939b4-a64f-442a-957d-d341f369c11b","Type":"ContainerStarted","Data":"896d1e992a152922ae3d1012f054ca62bf181295b35c1395e5367f0e5bec72fe"} Apr 22 21:11:21.466158 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.466098 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-szpzh" podStartSLOduration=1.6075815580000001 podStartE2EDuration="5.466081237s" podCreationTimestamp="2026-04-22 21:11:16 +0000 UTC" firstStartedPulling="2026-04-22 21:11:16.673221717 +0000 UTC m=+124.546113557" lastFinishedPulling="2026-04-22 21:11:20.531721402 +0000 UTC m=+128.404613236" observedRunningTime="2026-04-22 21:11:21.465062239 +0000 UTC m=+129.337954093" watchObservedRunningTime="2026-04-22 21:11:21.466081237 +0000 UTC m=+129.338973092" Apr 22 21:11:21.546301 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.546279 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-b6c46488-hvgp7_d6f8fd9e-523a-4f00-a44d-15d6e4dbb207/console/0.log" Apr 22 21:11:21.546386 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.546333 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:11:21.639926 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.639860 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-oauth-serving-cert\") pod \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " Apr 22 21:11:21.640065 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.639950 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np27f\" (UniqueName: \"kubernetes.io/projected/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-kube-api-access-np27f\") pod \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " Apr 22 21:11:21.640065 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.639968 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-config\") pod \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " Apr 22 21:11:21.640065 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.639992 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-serving-cert\") pod \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " Apr 22 21:11:21.640065 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.640024 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-oauth-config\") pod \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " Apr 22 21:11:21.640065 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.640043 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-trusted-ca-bundle\") pod \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " Apr 22 21:11:21.640065 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.640067 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-service-ca\") pod \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\" (UID: \"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207\") " Apr 22 21:11:21.640416 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.640284 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" (UID: "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:11:21.640536 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.640502 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" (UID: "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:11:21.640536 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.640517 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-config" (OuterVolumeSpecName: "console-config") pod "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" (UID: "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:11:21.640653 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.640567 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-service-ca" (OuterVolumeSpecName: "service-ca") pod "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" (UID: "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:11:21.642240 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.642215 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" (UID: "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:11:21.642626 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.642612 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-kube-api-access-np27f" (OuterVolumeSpecName: "kube-api-access-np27f") pod "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" (UID: "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207"). InnerVolumeSpecName "kube-api-access-np27f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:11:21.642678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.642638 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" (UID: "d6f8fd9e-523a-4f00-a44d-15d6e4dbb207"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:11:21.740660 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.740630 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-oauth-serving-cert\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:11:21.740660 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.740655 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-np27f\" (UniqueName: \"kubernetes.io/projected/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-kube-api-access-np27f\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:11:21.740811 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.740671 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:11:21.740811 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.740685 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-serving-cert\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:11:21.740811 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.740697 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-console-oauth-config\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:11:21.740811 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.740710 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-trusted-ca-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:11:21.740811 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:21.740724 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207-service-ca\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:11:22.453046 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:22.453019 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b6c46488-hvgp7_d6f8fd9e-523a-4f00-a44d-15d6e4dbb207/console/0.log" Apr 22 21:11:22.453508 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:22.453162 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b6c46488-hvgp7" Apr 22 21:11:22.453508 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:22.453165 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6c46488-hvgp7" event={"ID":"d6f8fd9e-523a-4f00-a44d-15d6e4dbb207","Type":"ContainerDied","Data":"f03a4de923839b341a75b05e8120be109e137ecb6ecb3115c8fcc0f9d07c1ebe"} Apr 22 21:11:22.453508 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:22.453209 2569 scope.go:117] "RemoveContainer" containerID="33eba23e5cb0df63b9d3149757016719e2141f8bfc0593961cfc6ad019bb3a4f" Apr 22 21:11:22.474414 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:22.474382 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b6c46488-hvgp7"] Apr 22 21:11:22.475639 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:22.475618 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b6c46488-hvgp7"] Apr 22 21:11:22.789179 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:22.789135 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" path="/var/lib/kubelet/pods/d6f8fd9e-523a-4f00-a44d-15d6e4dbb207/volumes" Apr 22 21:11:28.578477 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.578442 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66"] Apr 22 21:11:28.578989 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.578774 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" containerName="console" Apr 22 21:11:28.578989 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.578789 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" containerName="console" Apr 22 21:11:28.578989 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.578864 2569 
memory_manager.go:356] "RemoveStaleState removing state" podUID="d6f8fd9e-523a-4f00-a44d-15d6e4dbb207" containerName="console" Apr 22 21:11:28.582620 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.582604 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.584782 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.584752 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 21:11:28.584782 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.584777 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 21:11:28.584941 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.584808 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8dc7t\"" Apr 22 21:11:28.588292 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.588271 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66"] Apr 22 21:11:28.693041 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.693011 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lhxf\" (UniqueName: \"kubernetes.io/projected/7c0a32ac-970f-4f97-9e67-1c94039a0619-kube-api-access-4lhxf\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.693217 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.693047 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.693217 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.693073 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.793639 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.793614 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lhxf\" (UniqueName: \"kubernetes.io/projected/7c0a32ac-970f-4f97-9e67-1c94039a0619-kube-api-access-4lhxf\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.793762 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.793648 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.793762 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.793669 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.794001 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.793981 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.794041 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.794001 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.801767 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.801739 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lhxf\" (UniqueName: \"kubernetes.io/projected/7c0a32ac-970f-4f97-9e67-1c94039a0619-kube-api-access-4lhxf\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:28.892589 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:28.892537 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:29.006853 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:29.006827 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66"] Apr 22 21:11:29.008715 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:11:29.008681 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c0a32ac_970f_4f97_9e67_1c94039a0619.slice/crio-15d9b1372ff17ab4bf34cc99e6d3c6e762d9baeec06285cd0ab0323f0636d456 WatchSource:0}: Error finding container 15d9b1372ff17ab4bf34cc99e6d3c6e762d9baeec06285cd0ab0323f0636d456: Status 404 returned error can't find the container with id 15d9b1372ff17ab4bf34cc99e6d3c6e762d9baeec06285cd0ab0323f0636d456 Apr 22 21:11:29.479353 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:29.479317 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" event={"ID":"7c0a32ac-970f-4f97-9e67-1c94039a0619","Type":"ContainerStarted","Data":"15d9b1372ff17ab4bf34cc99e6d3c6e762d9baeec06285cd0ab0323f0636d456"} Apr 22 21:11:34.495997 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:34.495964 2569 generic.go:358] "Generic (PLEG): container finished" podID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerID="2dbd38f701dd30848360fc58df206c0ae8b20625c648c48173ae3281ba0f95f4" exitCode=0 Apr 22 21:11:34.496474 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:34.496056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" event={"ID":"7c0a32ac-970f-4f97-9e67-1c94039a0619","Type":"ContainerDied","Data":"2dbd38f701dd30848360fc58df206c0ae8b20625c648c48173ae3281ba0f95f4"} Apr 22 21:11:37.507545 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:11:37.507510 2569 generic.go:358] "Generic (PLEG): container finished" podID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerID="261a388ba1722cafc7e92f2c67dc416b0ebde0168285118dec433e247dc536ae" exitCode=0 Apr 22 21:11:37.508013 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:37.507599 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" event={"ID":"7c0a32ac-970f-4f97-9e67-1c94039a0619","Type":"ContainerDied","Data":"261a388ba1722cafc7e92f2c67dc416b0ebde0168285118dec433e247dc536ae"} Apr 22 21:11:45.532213 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:45.532176 2569 generic.go:358] "Generic (PLEG): container finished" podID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerID="1039f79b3b3add4d21535d84f0a25c50c0fe3010a9178eafb40b09a347c95594" exitCode=0 Apr 22 21:11:45.532585 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:45.532233 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" event={"ID":"7c0a32ac-970f-4f97-9e67-1c94039a0619","Type":"ContainerDied","Data":"1039f79b3b3add4d21535d84f0a25c50c0fe3010a9178eafb40b09a347c95594"} Apr 22 21:11:46.654173 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.654136 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" Apr 22 21:11:46.723455 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.723435 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lhxf\" (UniqueName: \"kubernetes.io/projected/7c0a32ac-970f-4f97-9e67-1c94039a0619-kube-api-access-4lhxf\") pod \"7c0a32ac-970f-4f97-9e67-1c94039a0619\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " Apr 22 21:11:46.723588 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.723479 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-bundle\") pod \"7c0a32ac-970f-4f97-9e67-1c94039a0619\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " Apr 22 21:11:46.723588 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.723494 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-util\") pod \"7c0a32ac-970f-4f97-9e67-1c94039a0619\" (UID: \"7c0a32ac-970f-4f97-9e67-1c94039a0619\") " Apr 22 21:11:46.724100 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.724075 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-bundle" (OuterVolumeSpecName: "bundle") pod "7c0a32ac-970f-4f97-9e67-1c94039a0619" (UID: "7c0a32ac-970f-4f97-9e67-1c94039a0619"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:11:46.725543 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.725519 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0a32ac-970f-4f97-9e67-1c94039a0619-kube-api-access-4lhxf" (OuterVolumeSpecName: "kube-api-access-4lhxf") pod "7c0a32ac-970f-4f97-9e67-1c94039a0619" (UID: "7c0a32ac-970f-4f97-9e67-1c94039a0619"). InnerVolumeSpecName "kube-api-access-4lhxf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:11:46.727635 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.727610 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-util" (OuterVolumeSpecName: "util") pod "7c0a32ac-970f-4f97-9e67-1c94039a0619" (UID: "7c0a32ac-970f-4f97-9e67-1c94039a0619"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:11:46.824356 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.824334 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4lhxf\" (UniqueName: \"kubernetes.io/projected/7c0a32ac-970f-4f97-9e67-1c94039a0619-kube-api-access-4lhxf\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:11:46.824356 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.824355 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:11:46.824486 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:46.824365 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c0a32ac-970f-4f97-9e67-1c94039a0619-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:11:47.539094 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:47.539069 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66"
Apr 22 21:11:47.539261 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:47.539065 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2zm66" event={"ID":"7c0a32ac-970f-4f97-9e67-1c94039a0619","Type":"ContainerDied","Data":"15d9b1372ff17ab4bf34cc99e6d3c6e762d9baeec06285cd0ab0323f0636d456"}
Apr 22 21:11:47.539261 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:47.539177 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15d9b1372ff17ab4bf34cc99e6d3c6e762d9baeec06285cd0ab0323f0636d456"
Apr 22 21:11:51.280648 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.280608 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"]
Apr 22 21:11:51.281212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.281068 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerName="extract"
Apr 22 21:11:51.281212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.281087 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerName="extract"
Apr 22 21:11:51.281212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.281116 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerName="util"
Apr 22 21:11:51.281212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.281124 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerName="util"
Apr 22 21:11:51.281212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.281134 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerName="pull"
Apr 22 21:11:51.281212 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.281166 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerName="pull"
Apr 22 21:11:51.281531 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.281247 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c0a32ac-970f-4f97-9e67-1c94039a0619" containerName="extract"
Apr 22 21:11:51.283603 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.283583 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"
Apr 22 21:11:51.286073 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.286051 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-5tx6r\""
Apr 22 21:11:51.286180 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.286106 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 22 21:11:51.286295 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.286274 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 22 21:11:51.295983 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.295961 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"]
Apr 22 21:11:51.357830 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.357802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbm7n\" (UniqueName: \"kubernetes.io/projected/2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5-kube-api-access-pbm7n\") pod \"cert-manager-operator-controller-manager-54b9655956-gb7rn\" (UID: \"2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"
Apr 22 21:11:51.357934 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.357911 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-gb7rn\" (UID: \"2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"
Apr 22 21:11:51.458862 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.458834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbm7n\" (UniqueName: \"kubernetes.io/projected/2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5-kube-api-access-pbm7n\") pod \"cert-manager-operator-controller-manager-54b9655956-gb7rn\" (UID: \"2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"
Apr 22 21:11:51.458967 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.458895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-gb7rn\" (UID: \"2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"
Apr 22 21:11:51.459224 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.459209 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-gb7rn\" (UID: \"2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"
Apr 22 21:11:51.466751 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.466728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbm7n\" (UniqueName: \"kubernetes.io/projected/2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5-kube-api-access-pbm7n\") pod \"cert-manager-operator-controller-manager-54b9655956-gb7rn\" (UID: \"2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"
Apr 22 21:11:51.592772 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.592702 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"
Apr 22 21:11:51.713339 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:51.713315 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn"]
Apr 22 21:11:51.715535 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:11:51.715508 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ec82946_a813_4bc8_9a9a_bbf1c87d7ec5.slice/crio-a814ee9c727fe6239c5b6c1725d62589398e60a3e6af293e91f992c9b1f9e7e9 WatchSource:0}: Error finding container a814ee9c727fe6239c5b6c1725d62589398e60a3e6af293e91f992c9b1f9e7e9: Status 404 returned error can't find the container with id a814ee9c727fe6239c5b6c1725d62589398e60a3e6af293e91f992c9b1f9e7e9
Apr 22 21:11:52.555862 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:52.555819 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn" event={"ID":"2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5","Type":"ContainerStarted","Data":"a814ee9c727fe6239c5b6c1725d62589398e60a3e6af293e91f992c9b1f9e7e9"}
Apr 22 21:11:54.563721 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:54.563686 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn" event={"ID":"2ec82946-a813-4bc8-9a9a-bbf1c87d7ec5","Type":"ContainerStarted","Data":"2a25acccfd67b4fc6129ca9b0af97be7b2c3e8bbb38e05b7d410d477a33e4535"}
Apr 22 21:11:54.582297 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:54.582255 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gb7rn" podStartSLOduration=1.5823819449999998 podStartE2EDuration="3.582242231s" podCreationTimestamp="2026-04-22 21:11:51 +0000 UTC" firstStartedPulling="2026-04-22 21:11:51.717889787 +0000 UTC m=+159.590781617" lastFinishedPulling="2026-04-22 21:11:53.717750067 +0000 UTC m=+161.590641903" observedRunningTime="2026-04-22 21:11:54.580841803 +0000 UTC m=+162.453733676" watchObservedRunningTime="2026-04-22 21:11:54.582242231 +0000 UTC m=+162.455134082"
Apr 22 21:11:55.551004 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.550971 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"]
Apr 22 21:11:55.553295 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.553277 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.555654 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.555632 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 21:11:55.555766 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.555633 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8dc7t\""
Apr 22 21:11:55.555766 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.555675 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 21:11:55.561409 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.561389 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"]
Apr 22 21:11:55.591950 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.591927 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.592265 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.591959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdmg6\" (UniqueName: \"kubernetes.io/projected/036c1253-f650-439c-8948-323ab1bd29c0-kube-api-access-mdmg6\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.592265 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.591980 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.692499 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.692469 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.692499 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.692503 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdmg6\" (UniqueName: \"kubernetes.io/projected/036c1253-f650-439c-8948-323ab1bd29c0-kube-api-access-mdmg6\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.692719 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.692524 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.692935 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.692914 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.692982 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.692930 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.699873 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.699853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdmg6\" (UniqueName: \"kubernetes.io/projected/036c1253-f650-439c-8948-323ab1bd29c0-kube-api-access-mdmg6\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.746315 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.746290 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 21:11:55.761175 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.761153 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 21:11:55.863542 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.863478 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:11:55.998397 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:55.998374 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"]
Apr 22 21:11:56.000182 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:11:56.000130 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod036c1253_f650_439c_8948_323ab1bd29c0.slice/crio-a444a7171aeec314b0f96ce48cc96bd3ac6e0fe099cfe5bfeb11364711279e3c WatchSource:0}: Error finding container a444a7171aeec314b0f96ce48cc96bd3ac6e0fe099cfe5bfeb11364711279e3c: Status 404 returned error can't find the container with id a444a7171aeec314b0f96ce48cc96bd3ac6e0fe099cfe5bfeb11364711279e3c
Apr 22 21:11:56.571859 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.571825 2569 generic.go:358] "Generic (PLEG): container finished" podID="036c1253-f650-439c-8948-323ab1bd29c0" containerID="dc544e5348e0703cf6fa3e5477da42bd4a3e260953d283d4960a1940f39ddd6a" exitCode=0
Apr 22 21:11:56.572026 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.571905 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt" event={"ID":"036c1253-f650-439c-8948-323ab1bd29c0","Type":"ContainerDied","Data":"dc544e5348e0703cf6fa3e5477da42bd4a3e260953d283d4960a1940f39ddd6a"}
Apr 22 21:11:56.572026 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.571938 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt" event={"ID":"036c1253-f650-439c-8948-323ab1bd29c0","Type":"ContainerStarted","Data":"a444a7171aeec314b0f96ce48cc96bd3ac6e0fe099cfe5bfeb11364711279e3c"}
Apr 22 21:11:56.588028 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.588010 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 21:11:56.978550 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.978478 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-59d27"]
Apr 22 21:11:56.980685 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.980667 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:11:56.982901 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.982879 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 21:11:56.983010 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.982880 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 21:11:56.983121 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.983107 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-d45xs\""
Apr 22 21:11:56.988953 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:56.988933 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-59d27"]
Apr 22 21:11:57.107086 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:57.107061 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-592tk\" (UniqueName: \"kubernetes.io/projected/67621c3b-d050-44bd-908a-deb6ca11addf-kube-api-access-592tk\") pod \"cert-manager-webhook-587ccfb98-59d27\" (UID: \"67621c3b-d050-44bd-908a-deb6ca11addf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:11:57.107233 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:57.107110 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67621c3b-d050-44bd-908a-deb6ca11addf-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-59d27\" (UID: \"67621c3b-d050-44bd-908a-deb6ca11addf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:11:57.207591 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:57.207568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-592tk\" (UniqueName: \"kubernetes.io/projected/67621c3b-d050-44bd-908a-deb6ca11addf-kube-api-access-592tk\") pod \"cert-manager-webhook-587ccfb98-59d27\" (UID: \"67621c3b-d050-44bd-908a-deb6ca11addf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:11:57.207703 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:57.207610 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67621c3b-d050-44bd-908a-deb6ca11addf-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-59d27\" (UID: \"67621c3b-d050-44bd-908a-deb6ca11addf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:11:57.215063 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:57.215043 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67621c3b-d050-44bd-908a-deb6ca11addf-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-59d27\" (UID: \"67621c3b-d050-44bd-908a-deb6ca11addf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:11:57.215304 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:57.215282 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-592tk\" (UniqueName: \"kubernetes.io/projected/67621c3b-d050-44bd-908a-deb6ca11addf-kube-api-access-592tk\") pod \"cert-manager-webhook-587ccfb98-59d27\" (UID: \"67621c3b-d050-44bd-908a-deb6ca11addf\") " pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:11:57.299643 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:57.299618 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:11:57.412857 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:57.412834 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-59d27"]
Apr 22 21:11:57.414751 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:11:57.414717 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67621c3b_d050_44bd_908a_deb6ca11addf.slice/crio-89118cace8ff38ac6282d08570d6cfc1bb9978b0880ce5840ff72eaf64cf6c0f WatchSource:0}: Error finding container 89118cace8ff38ac6282d08570d6cfc1bb9978b0880ce5840ff72eaf64cf6c0f: Status 404 returned error can't find the container with id 89118cace8ff38ac6282d08570d6cfc1bb9978b0880ce5840ff72eaf64cf6c0f
Apr 22 21:11:57.577257 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:57.577168 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-59d27" event={"ID":"67621c3b-d050-44bd-908a-deb6ca11addf","Type":"ContainerStarted","Data":"89118cace8ff38ac6282d08570d6cfc1bb9978b0880ce5840ff72eaf64cf6c0f"}
Apr 22 21:11:59.586346 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:59.586309 2569 generic.go:358] "Generic (PLEG): container finished" podID="036c1253-f650-439c-8948-323ab1bd29c0" containerID="a673039a4dfa856d7f8e2cd23f49753be83f5393fde4e47dc5f0d4f9e22f90da" exitCode=0
Apr 22 21:11:59.586872 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:11:59.586401 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt" event={"ID":"036c1253-f650-439c-8948-323ab1bd29c0","Type":"ContainerDied","Data":"a673039a4dfa856d7f8e2cd23f49753be83f5393fde4e47dc5f0d4f9e22f90da"}
Apr 22 21:12:00.591136 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:00.591065 2569 generic.go:358] "Generic (PLEG): container finished" podID="036c1253-f650-439c-8948-323ab1bd29c0" containerID="79d7e0685c3149aafadcbc0c5210d6f2368a1c442ea4fba2fe4957e00fc3e095" exitCode=0
Apr 22 21:12:00.591480 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:00.591172 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt" event={"ID":"036c1253-f650-439c-8948-323ab1bd29c0","Type":"ContainerDied","Data":"79d7e0685c3149aafadcbc0c5210d6f2368a1c442ea4fba2fe4957e00fc3e095"}
Apr 22 21:12:01.595237 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.595201 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-59d27" event={"ID":"67621c3b-d050-44bd-908a-deb6ca11addf","Type":"ContainerStarted","Data":"c429e4f751da0d9ac69ebdccb25e26a4fdfef8ac45f9a8ed87b617a736e9297b"}
Apr 22 21:12:01.595674 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.595489 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:12:01.611910 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.611744 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-59d27" podStartSLOduration=2.377808458 podStartE2EDuration="5.611729374s" podCreationTimestamp="2026-04-22 21:11:56 +0000 UTC" firstStartedPulling="2026-04-22 21:11:57.416650688 +0000 UTC m=+165.289542520" lastFinishedPulling="2026-04-22 21:12:00.650571606 +0000 UTC m=+168.523463436" observedRunningTime="2026-04-22 21:12:01.611302035 +0000 UTC m=+169.484193888" watchObservedRunningTime="2026-04-22 21:12:01.611729374 +0000 UTC m=+169.484621228"
Apr 22 21:12:01.715607 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.715582 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:12:01.851347 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.851274 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-util\") pod \"036c1253-f650-439c-8948-323ab1bd29c0\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") "
Apr 22 21:12:01.851347 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.851309 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-bundle\") pod \"036c1253-f650-439c-8948-323ab1bd29c0\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") "
Apr 22 21:12:01.851521 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.851413 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdmg6\" (UniqueName: \"kubernetes.io/projected/036c1253-f650-439c-8948-323ab1bd29c0-kube-api-access-mdmg6\") pod \"036c1253-f650-439c-8948-323ab1bd29c0\" (UID: \"036c1253-f650-439c-8948-323ab1bd29c0\") "
Apr 22 21:12:01.851748 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.851722 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-bundle" (OuterVolumeSpecName: "bundle") pod "036c1253-f650-439c-8948-323ab1bd29c0" (UID: "036c1253-f650-439c-8948-323ab1bd29c0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:12:01.853497 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.853473 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/036c1253-f650-439c-8948-323ab1bd29c0-kube-api-access-mdmg6" (OuterVolumeSpecName: "kube-api-access-mdmg6") pod "036c1253-f650-439c-8948-323ab1bd29c0" (UID: "036c1253-f650-439c-8948-323ab1bd29c0"). InnerVolumeSpecName "kube-api-access-mdmg6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:12:01.856242 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.856222 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-util" (OuterVolumeSpecName: "util") pod "036c1253-f650-439c-8948-323ab1bd29c0" (UID: "036c1253-f650-439c-8948-323ab1bd29c0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:12:01.952975 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.952956 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:12:01.952975 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.952975 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/036c1253-f650-439c-8948-323ab1bd29c0-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:12:01.953092 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:01.952984 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mdmg6\" (UniqueName: \"kubernetes.io/projected/036c1253-f650-439c-8948-323ab1bd29c0-kube-api-access-mdmg6\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\""
Apr 22 21:12:02.604207 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:02.604088 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt" event={"ID":"036c1253-f650-439c-8948-323ab1bd29c0","Type":"ContainerDied","Data":"a444a7171aeec314b0f96ce48cc96bd3ac6e0fe099cfe5bfeb11364711279e3c"}
Apr 22 21:12:02.604207 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:02.604133 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a444a7171aeec314b0f96ce48cc96bd3ac6e0fe099cfe5bfeb11364711279e3c"
Apr 22 21:12:02.604207 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:02.604166 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwqfnt"
Apr 22 21:12:07.606606 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:07.606572 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-59d27"
Apr 22 21:12:09.141213 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.141182 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"]
Apr 22 21:12:09.141569 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.141550 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="036c1253-f650-439c-8948-323ab1bd29c0" containerName="extract"
Apr 22 21:12:09.141569 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.141564 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="036c1253-f650-439c-8948-323ab1bd29c0" containerName="extract"
Apr 22 21:12:09.141643 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.141577 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="036c1253-f650-439c-8948-323ab1bd29c0" containerName="util"
Apr 22 21:12:09.141643 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.141583 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="036c1253-f650-439c-8948-323ab1bd29c0" containerName="util"
Apr 22 21:12:09.141643 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.141602 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="036c1253-f650-439c-8948-323ab1bd29c0" containerName="pull"
Apr 22 21:12:09.141643 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.141607 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="036c1253-f650-439c-8948-323ab1bd29c0" containerName="pull"
Apr 22 21:12:09.141763 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.141658 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="036c1253-f650-439c-8948-323ab1bd29c0" containerName="extract"
Apr 22 21:12:09.145973 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.145956 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"
Apr 22 21:12:09.148426 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.148404 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 21:12:09.148521 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.148458 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 21:12:09.149361 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.149345 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-p9qcl\""
Apr 22 21:12:09.155415 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.155388 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"]
Apr 22 21:12:09.215561 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.215534 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkg2h\" (UniqueName: \"kubernetes.io/projected/1ba1824a-dda6-4c46-a8f1-c95420a43eb1-kube-api-access-hkg2h\") pod \"openshift-lws-operator-bfc7f696d-gfrvn\" (UID: \"1ba1824a-dda6-4c46-a8f1-c95420a43eb1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"
Apr 22 21:12:09.215661 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.215600 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ba1824a-dda6-4c46-a8f1-c95420a43eb1-tmp\") pod \"openshift-lws-operator-bfc7f696d-gfrvn\" (UID: \"1ba1824a-dda6-4c46-a8f1-c95420a43eb1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"
Apr 22 21:12:09.316129 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.316108 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkg2h\" (UniqueName: \"kubernetes.io/projected/1ba1824a-dda6-4c46-a8f1-c95420a43eb1-kube-api-access-hkg2h\") pod \"openshift-lws-operator-bfc7f696d-gfrvn\" (UID: \"1ba1824a-dda6-4c46-a8f1-c95420a43eb1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"
Apr 22 21:12:09.316255 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.316182 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ba1824a-dda6-4c46-a8f1-c95420a43eb1-tmp\") pod \"openshift-lws-operator-bfc7f696d-gfrvn\" (UID: \"1ba1824a-dda6-4c46-a8f1-c95420a43eb1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"
Apr 22 21:12:09.316502 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.316486 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ba1824a-dda6-4c46-a8f1-c95420a43eb1-tmp\") pod \"openshift-lws-operator-bfc7f696d-gfrvn\" (UID: \"1ba1824a-dda6-4c46-a8f1-c95420a43eb1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"
Apr 22 21:12:09.327753 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.327728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkg2h\" (UniqueName: \"kubernetes.io/projected/1ba1824a-dda6-4c46-a8f1-c95420a43eb1-kube-api-access-hkg2h\") pod \"openshift-lws-operator-bfc7f696d-gfrvn\" (UID: \"1ba1824a-dda6-4c46-a8f1-c95420a43eb1\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"
Apr 22 21:12:09.455984 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.455933 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"
Apr 22 21:12:09.568777 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.568718 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"]
Apr 22 21:12:09.570963 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:12:09.570935 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba1824a_dda6_4c46_a8f1_c95420a43eb1.slice/crio-9f7056907c72aad5c5460f8c71b24c45d32d4e72032908034bca542741e0528d WatchSource:0}: Error finding container 9f7056907c72aad5c5460f8c71b24c45d32d4e72032908034bca542741e0528d: Status 404 returned error can't find the container with id 9f7056907c72aad5c5460f8c71b24c45d32d4e72032908034bca542741e0528d
Apr 22 21:12:09.627162 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:09.627115 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn" event={"ID":"1ba1824a-dda6-4c46-a8f1-c95420a43eb1","Type":"ContainerStarted","Data":"9f7056907c72aad5c5460f8c71b24c45d32d4e72032908034bca542741e0528d"}
Apr 22 21:12:11.635724 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:11.635638 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn"
event={"ID":"1ba1824a-dda6-4c46-a8f1-c95420a43eb1","Type":"ContainerStarted","Data":"2cbe09dea034adc2bf44a959869e2c2e883ad731bdbf07b91ca595b07fba3dcd"} Apr 22 21:12:11.650522 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:11.650469 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-gfrvn" podStartSLOduration=0.905065265 podStartE2EDuration="2.650451651s" podCreationTimestamp="2026-04-22 21:12:09 +0000 UTC" firstStartedPulling="2026-04-22 21:12:09.572434491 +0000 UTC m=+177.445326320" lastFinishedPulling="2026-04-22 21:12:11.317820866 +0000 UTC m=+179.190712706" observedRunningTime="2026-04-22 21:12:11.650429238 +0000 UTC m=+179.523321094" watchObservedRunningTime="2026-04-22 21:12:11.650451651 +0000 UTC m=+179.523343504" Apr 22 21:12:14.162320 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.162287 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h"] Apr 22 21:12:14.167178 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.167138 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.169531 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.169499 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 21:12:14.170511 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.170488 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 21:12:14.170932 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.170912 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8dc7t\"" Apr 22 21:12:14.171789 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.171767 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h"] Apr 22 21:12:14.259677 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.259647 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.259825 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.259708 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7txn\" (UniqueName: \"kubernetes.io/projected/27d41add-e91d-4bb8-aa8f-f144b7e8105d-kube-api-access-s7txn\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.259825 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.259804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.360675 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.360643 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.360800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.360686 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.360800 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.360740 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7txn\" (UniqueName: \"kubernetes.io/projected/27d41add-e91d-4bb8-aa8f-f144b7e8105d-kube-api-access-s7txn\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.361043 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.361021 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.361085 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.361054 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.368480 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.368447 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7txn\" (UniqueName: \"kubernetes.io/projected/27d41add-e91d-4bb8-aa8f-f144b7e8105d-kube-api-access-s7txn\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.477693 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.477608 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:14.591451 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.591425 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h"] Apr 22 21:12:14.593368 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:12:14.593343 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d41add_e91d_4bb8_aa8f_f144b7e8105d.slice/crio-6f2634d49d2e30c488c0dc1b3690c3d73d7ff50d36f65b0797347d030cbf5a88 WatchSource:0}: Error finding container 6f2634d49d2e30c488c0dc1b3690c3d73d7ff50d36f65b0797347d030cbf5a88: Status 404 returned error can't find the container with id 6f2634d49d2e30c488c0dc1b3690c3d73d7ff50d36f65b0797347d030cbf5a88 Apr 22 21:12:14.648083 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:14.648052 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" event={"ID":"27d41add-e91d-4bb8-aa8f-f144b7e8105d","Type":"ContainerStarted","Data":"6f2634d49d2e30c488c0dc1b3690c3d73d7ff50d36f65b0797347d030cbf5a88"} Apr 22 21:12:15.652602 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:15.652515 2569 generic.go:358] "Generic (PLEG): container finished" podID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerID="e614aef2e3ae0888ebef9e18e7bfb90ed1bc3f9ec743176b89f617bb413dfe88" exitCode=0 Apr 22 21:12:15.652602 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:15.652575 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" event={"ID":"27d41add-e91d-4bb8-aa8f-f144b7e8105d","Type":"ContainerDied","Data":"e614aef2e3ae0888ebef9e18e7bfb90ed1bc3f9ec743176b89f617bb413dfe88"} Apr 22 21:12:16.657507 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:12:16.657422 2569 generic.go:358] "Generic (PLEG): container finished" podID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerID="65b3627ae76bc19574439a6be16fabde891d010e1203b636ccd72a6a968cfa02" exitCode=0 Apr 22 21:12:16.657860 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:16.657502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" event={"ID":"27d41add-e91d-4bb8-aa8f-f144b7e8105d","Type":"ContainerDied","Data":"65b3627ae76bc19574439a6be16fabde891d010e1203b636ccd72a6a968cfa02"} Apr 22 21:12:17.662272 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:17.662238 2569 generic.go:358] "Generic (PLEG): container finished" podID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerID="ec95e2981a19179bdb774c4447156fd98aa341bcc615ec0b8cc4dc30dbe07e37" exitCode=0 Apr 22 21:12:17.662651 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:17.662291 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" event={"ID":"27d41add-e91d-4bb8-aa8f-f144b7e8105d","Type":"ContainerDied","Data":"ec95e2981a19179bdb774c4447156fd98aa341bcc615ec0b8cc4dc30dbe07e37"} Apr 22 21:12:18.784431 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:18.784405 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:18.899606 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:18.899578 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7txn\" (UniqueName: \"kubernetes.io/projected/27d41add-e91d-4bb8-aa8f-f144b7e8105d-kube-api-access-s7txn\") pod \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " Apr 22 21:12:18.899740 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:18.899666 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-util\") pod \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " Apr 22 21:12:18.899796 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:18.899741 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-bundle\") pod \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\" (UID: \"27d41add-e91d-4bb8-aa8f-f144b7e8105d\") " Apr 22 21:12:18.900437 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:18.900406 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-bundle" (OuterVolumeSpecName: "bundle") pod "27d41add-e91d-4bb8-aa8f-f144b7e8105d" (UID: "27d41add-e91d-4bb8-aa8f-f144b7e8105d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:12:18.901727 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:18.901701 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d41add-e91d-4bb8-aa8f-f144b7e8105d-kube-api-access-s7txn" (OuterVolumeSpecName: "kube-api-access-s7txn") pod "27d41add-e91d-4bb8-aa8f-f144b7e8105d" (UID: "27d41add-e91d-4bb8-aa8f-f144b7e8105d"). InnerVolumeSpecName "kube-api-access-s7txn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:12:18.905086 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:18.905064 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-util" (OuterVolumeSpecName: "util") pod "27d41add-e91d-4bb8-aa8f-f144b7e8105d" (UID: "27d41add-e91d-4bb8-aa8f-f144b7e8105d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:12:19.000820 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:19.000763 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:12:19.000820 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:19.000786 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s7txn\" (UniqueName: \"kubernetes.io/projected/27d41add-e91d-4bb8-aa8f-f144b7e8105d-kube-api-access-s7txn\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:12:19.000820 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:19.000797 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/27d41add-e91d-4bb8-aa8f-f144b7e8105d-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:12:19.671017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:19.670990 2569 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" Apr 22 21:12:19.671017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:19.671002 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5mwc8h" event={"ID":"27d41add-e91d-4bb8-aa8f-f144b7e8105d","Type":"ContainerDied","Data":"6f2634d49d2e30c488c0dc1b3690c3d73d7ff50d36f65b0797347d030cbf5a88"} Apr 22 21:12:19.671313 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:19.671038 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f2634d49d2e30c488c0dc1b3690c3d73d7ff50d36f65b0797347d030cbf5a88" Apr 22 21:12:30.970340 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.970302 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd"] Apr 22 21:12:30.970812 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.970787 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerName="extract" Apr 22 21:12:30.970812 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.970807 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerName="extract" Apr 22 21:12:30.970945 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.970825 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerName="util" Apr 22 21:12:30.970945 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.970831 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerName="util" Apr 22 21:12:30.970945 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.970841 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerName="pull" Apr 22 21:12:30.970945 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.970847 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerName="pull" Apr 22 21:12:30.970945 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.970908 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="27d41add-e91d-4bb8-aa8f-f144b7e8105d" containerName="extract" Apr 22 21:12:30.974188 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.974171 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:30.976509 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.976490 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8dc7t\"" Apr 22 21:12:30.976588 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.976532 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 21:12:30.977422 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.977408 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 21:12:30.980410 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.980392 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd"] Apr 22 21:12:30.990837 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.990806 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddxvs\" (UniqueName: \"kubernetes.io/projected/e647d760-3960-43cf-bf3c-8fd914a12af8-kube-api-access-ddxvs\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd\" (UID: 
\"e647d760-3960-43cf-bf3c-8fd914a12af8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:30.990941 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.990863 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:30.990941 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:30.990928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:31.091592 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.091561 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:31.091728 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.091638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxvs\" (UniqueName: \"kubernetes.io/projected/e647d760-3960-43cf-bf3c-8fd914a12af8-kube-api-access-ddxvs\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:31.091728 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.091681 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:31.091937 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.091917 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:31.092000 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.091956 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:31.101715 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.101696 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxvs\" (UniqueName: \"kubernetes.io/projected/e647d760-3960-43cf-bf3c-8fd914a12af8-kube-api-access-ddxvs\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 
21:12:31.285415 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.285394 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:31.400625 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.400602 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd"] Apr 22 21:12:31.402396 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:12:31.402369 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode647d760_3960_43cf_bf3c_8fd914a12af8.slice/crio-83c41515985fbca1e025e16d64df4b3aefc9fcee5d20f9c7923692a938852405 WatchSource:0}: Error finding container 83c41515985fbca1e025e16d64df4b3aefc9fcee5d20f9c7923692a938852405: Status 404 returned error can't find the container with id 83c41515985fbca1e025e16d64df4b3aefc9fcee5d20f9c7923692a938852405 Apr 22 21:12:31.715866 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.715781 2569 generic.go:358] "Generic (PLEG): container finished" podID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerID="78f0170e56f106f7b13548cdbe15031de1460592870d4c39292cea30786899d8" exitCode=0 Apr 22 21:12:31.715866 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.715855 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" event={"ID":"e647d760-3960-43cf-bf3c-8fd914a12af8","Type":"ContainerDied","Data":"78f0170e56f106f7b13548cdbe15031de1460592870d4c39292cea30786899d8"} Apr 22 21:12:31.716032 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.715877 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" 
event={"ID":"e647d760-3960-43cf-bf3c-8fd914a12af8","Type":"ContainerStarted","Data":"83c41515985fbca1e025e16d64df4b3aefc9fcee5d20f9c7923692a938852405"} Apr 22 21:12:31.949334 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.949305 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj"] Apr 22 21:12:31.952556 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.952535 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:31.954879 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.954859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jq55h\"" Apr 22 21:12:31.954982 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.954887 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 21:12:31.955048 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.954981 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 21:12:31.955120 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.955100 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 21:12:31.955347 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.955334 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 21:12:31.968039 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.967998 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj"] Apr 22 21:12:31.997308 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.997283 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c42afd88-4c86-464c-b4d1-b5705f790287-apiservice-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-28svj\" (UID: \"c42afd88-4c86-464c-b4d1-b5705f790287\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:31.997731 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.997313 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckmvz\" (UniqueName: \"kubernetes.io/projected/c42afd88-4c86-464c-b4d1-b5705f790287-kube-api-access-ckmvz\") pod \"opendatahub-operator-controller-manager-754bfc4657-28svj\" (UID: \"c42afd88-4c86-464c-b4d1-b5705f790287\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:31.997731 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:31.997342 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c42afd88-4c86-464c-b4d1-b5705f790287-webhook-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-28svj\" (UID: \"c42afd88-4c86-464c-b4d1-b5705f790287\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:32.097949 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.097921 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c42afd88-4c86-464c-b4d1-b5705f790287-apiservice-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-28svj\" (UID: \"c42afd88-4c86-464c-b4d1-b5705f790287\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:32.098077 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.097964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ckmvz\" (UniqueName: \"kubernetes.io/projected/c42afd88-4c86-464c-b4d1-b5705f790287-kube-api-access-ckmvz\") pod \"opendatahub-operator-controller-manager-754bfc4657-28svj\" (UID: \"c42afd88-4c86-464c-b4d1-b5705f790287\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:32.098077 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.097999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c42afd88-4c86-464c-b4d1-b5705f790287-webhook-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-28svj\" (UID: \"c42afd88-4c86-464c-b4d1-b5705f790287\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:32.100504 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.100483 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c42afd88-4c86-464c-b4d1-b5705f790287-apiservice-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-28svj\" (UID: \"c42afd88-4c86-464c-b4d1-b5705f790287\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:32.100605 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.100573 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c42afd88-4c86-464c-b4d1-b5705f790287-webhook-cert\") pod \"opendatahub-operator-controller-manager-754bfc4657-28svj\" (UID: \"c42afd88-4c86-464c-b4d1-b5705f790287\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:32.106034 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.106006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckmvz\" (UniqueName: \"kubernetes.io/projected/c42afd88-4c86-464c-b4d1-b5705f790287-kube-api-access-ckmvz\") pod 
\"opendatahub-operator-controller-manager-754bfc4657-28svj\" (UID: \"c42afd88-4c86-464c-b4d1-b5705f790287\") " pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:32.263260 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.263234 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:32.398346 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.398292 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj"] Apr 22 21:12:32.411518 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:12:32.411488 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42afd88_4c86_464c_b4d1_b5705f790287.slice/crio-1a4a2fd75f2c3706092c7971d1f0f49673e0f1574afa01251f8f05fb1c9417c1 WatchSource:0}: Error finding container 1a4a2fd75f2c3706092c7971d1f0f49673e0f1574afa01251f8f05fb1c9417c1: Status 404 returned error can't find the container with id 1a4a2fd75f2c3706092c7971d1f0f49673e0f1574afa01251f8f05fb1c9417c1 Apr 22 21:12:32.720587 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.720555 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" event={"ID":"c42afd88-4c86-464c-b4d1-b5705f790287","Type":"ContainerStarted","Data":"1a4a2fd75f2c3706092c7971d1f0f49673e0f1574afa01251f8f05fb1c9417c1"} Apr 22 21:12:32.722303 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:32.722276 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" event={"ID":"e647d760-3960-43cf-bf3c-8fd914a12af8","Type":"ContainerStarted","Data":"8820f017d08551cea7eb7abbe7569329409c9fba170a5ee832d39ff6b2945b80"} Apr 22 21:12:33.728226 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:12:33.728166 2569 generic.go:358] "Generic (PLEG): container finished" podID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerID="8820f017d08551cea7eb7abbe7569329409c9fba170a5ee832d39ff6b2945b80" exitCode=0 Apr 22 21:12:33.728640 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:33.728337 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" event={"ID":"e647d760-3960-43cf-bf3c-8fd914a12af8","Type":"ContainerDied","Data":"8820f017d08551cea7eb7abbe7569329409c9fba170a5ee832d39ff6b2945b80"} Apr 22 21:12:34.734361 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:34.734323 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" event={"ID":"e647d760-3960-43cf-bf3c-8fd914a12af8","Type":"ContainerStarted","Data":"dbca0ee42bd2c2105f3d4c3e16a60f82a735e707c4619d07d64942123c61dfd3"} Apr 22 21:12:34.753681 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:34.753633 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" podStartSLOduration=3.861726021 podStartE2EDuration="4.753618187s" podCreationTimestamp="2026-04-22 21:12:30 +0000 UTC" firstStartedPulling="2026-04-22 21:12:31.716370218 +0000 UTC m=+199.589262048" lastFinishedPulling="2026-04-22 21:12:32.608262369 +0000 UTC m=+200.481154214" observedRunningTime="2026-04-22 21:12:34.751625933 +0000 UTC m=+202.624517821" watchObservedRunningTime="2026-04-22 21:12:34.753618187 +0000 UTC m=+202.626510090" Apr 22 21:12:35.739933 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:35.739902 2569 generic.go:358] "Generic (PLEG): container finished" podID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerID="dbca0ee42bd2c2105f3d4c3e16a60f82a735e707c4619d07d64942123c61dfd3" exitCode=0 Apr 22 21:12:35.740305 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:12:35.740041 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" event={"ID":"e647d760-3960-43cf-bf3c-8fd914a12af8","Type":"ContainerDied","Data":"dbca0ee42bd2c2105f3d4c3e16a60f82a735e707c4619d07d64942123c61dfd3"} Apr 22 21:12:35.741475 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:35.741456 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" event={"ID":"c42afd88-4c86-464c-b4d1-b5705f790287","Type":"ContainerStarted","Data":"6f7fe9e92e1287b094f634cd7993abe067ac6b6b45c32f787822a1d65fa9ea74"} Apr 22 21:12:35.741602 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:35.741591 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:35.782884 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:35.782841 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" podStartSLOduration=2.010207162 podStartE2EDuration="4.782828093s" podCreationTimestamp="2026-04-22 21:12:31 +0000 UTC" firstStartedPulling="2026-04-22 21:12:32.413459755 +0000 UTC m=+200.286351585" lastFinishedPulling="2026-04-22 21:12:35.186080683 +0000 UTC m=+203.058972516" observedRunningTime="2026-04-22 21:12:35.781056666 +0000 UTC m=+203.653948515" watchObservedRunningTime="2026-04-22 21:12:35.782828093 +0000 UTC m=+203.655719944" Apr 22 21:12:36.862107 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:36.862084 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:36.943399 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:36.943364 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddxvs\" (UniqueName: \"kubernetes.io/projected/e647d760-3960-43cf-bf3c-8fd914a12af8-kube-api-access-ddxvs\") pod \"e647d760-3960-43cf-bf3c-8fd914a12af8\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " Apr 22 21:12:36.943399 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:36.943403 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-util\") pod \"e647d760-3960-43cf-bf3c-8fd914a12af8\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " Apr 22 21:12:36.943602 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:36.943429 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-bundle\") pod \"e647d760-3960-43cf-bf3c-8fd914a12af8\" (UID: \"e647d760-3960-43cf-bf3c-8fd914a12af8\") " Apr 22 21:12:36.944197 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:36.944167 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-bundle" (OuterVolumeSpecName: "bundle") pod "e647d760-3960-43cf-bf3c-8fd914a12af8" (UID: "e647d760-3960-43cf-bf3c-8fd914a12af8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:12:36.945432 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:36.945408 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e647d760-3960-43cf-bf3c-8fd914a12af8-kube-api-access-ddxvs" (OuterVolumeSpecName: "kube-api-access-ddxvs") pod "e647d760-3960-43cf-bf3c-8fd914a12af8" (UID: "e647d760-3960-43cf-bf3c-8fd914a12af8"). InnerVolumeSpecName "kube-api-access-ddxvs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:12:36.947807 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:36.947765 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-util" (OuterVolumeSpecName: "util") pod "e647d760-3960-43cf-bf3c-8fd914a12af8" (UID: "e647d760-3960-43cf-bf3c-8fd914a12af8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:12:37.044585 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:37.044557 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:12:37.044585 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:37.044581 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddxvs\" (UniqueName: \"kubernetes.io/projected/e647d760-3960-43cf-bf3c-8fd914a12af8-kube-api-access-ddxvs\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:12:37.044585 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:37.044591 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e647d760-3960-43cf-bf3c-8fd914a12af8-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:12:37.750979 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:37.750950 2569 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" Apr 22 21:12:37.750979 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:37.750965 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97bkzd" event={"ID":"e647d760-3960-43cf-bf3c-8fd914a12af8","Type":"ContainerDied","Data":"83c41515985fbca1e025e16d64df4b3aefc9fcee5d20f9c7923692a938852405"} Apr 22 21:12:37.751197 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:37.750998 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83c41515985fbca1e025e16d64df4b3aefc9fcee5d20f9c7923692a938852405" Apr 22 21:12:46.746990 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:46.746960 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-754bfc4657-28svj" Apr 22 21:12:48.942259 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.942224 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw"] Apr 22 21:12:48.942609 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.942546 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerName="pull" Apr 22 21:12:48.942609 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.942557 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerName="pull" Apr 22 21:12:48.942609 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.942573 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerName="util" Apr 22 21:12:48.942609 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.942578 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerName="util" Apr 22 21:12:48.942609 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.942586 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerName="extract" Apr 22 21:12:48.942609 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.942592 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerName="extract" Apr 22 21:12:48.942787 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.942647 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e647d760-3960-43cf-bf3c-8fd914a12af8" containerName="extract" Apr 22 21:12:48.950804 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.950776 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:48.953644 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.953623 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw"] Apr 22 21:12:48.954480 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.954414 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 21:12:48.954480 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.954482 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 21:12:48.954647 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.954484 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-ktfdx\"" Apr 22 21:12:48.954647 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:48.954417 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 21:12:49.038490 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.038459 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn7tb\" (UniqueName: \"kubernetes.io/projected/257ddb76-5df7-44e1-8222-4f8fd3909da9-kube-api-access-hn7tb\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.038490 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.038492 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/257ddb76-5df7-44e1-8222-4f8fd3909da9-cert\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.038650 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.038522 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/257ddb76-5df7-44e1-8222-4f8fd3909da9-manager-config\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.038650 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.038558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/257ddb76-5df7-44e1-8222-4f8fd3909da9-metrics-cert\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.139550 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.139501 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/257ddb76-5df7-44e1-8222-4f8fd3909da9-manager-config\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.139550 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.139557 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/257ddb76-5df7-44e1-8222-4f8fd3909da9-metrics-cert\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.139785 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.139647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn7tb\" (UniqueName: \"kubernetes.io/projected/257ddb76-5df7-44e1-8222-4f8fd3909da9-kube-api-access-hn7tb\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.139785 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.139680 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/257ddb76-5df7-44e1-8222-4f8fd3909da9-cert\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.140167 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.140120 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/257ddb76-5df7-44e1-8222-4f8fd3909da9-manager-config\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: 
\"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.141931 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.141901 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/257ddb76-5df7-44e1-8222-4f8fd3909da9-metrics-cert\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.142074 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.142052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/257ddb76-5df7-44e1-8222-4f8fd3909da9-cert\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.152449 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.152430 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn7tb\" (UniqueName: \"kubernetes.io/projected/257ddb76-5df7-44e1-8222-4f8fd3909da9-kube-api-access-hn7tb\") pod \"lws-controller-manager-7979f84667-9gxqw\" (UID: \"257ddb76-5df7-44e1-8222-4f8fd3909da9\") " pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.261417 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.261396 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:49.380127 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.380103 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw"] Apr 22 21:12:49.382098 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:12:49.382071 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod257ddb76_5df7_44e1_8222_4f8fd3909da9.slice/crio-23397a8e9e48e816dbea0e8a74d8e2be89811b6e1ff4b975f82de01a117d342e WatchSource:0}: Error finding container 23397a8e9e48e816dbea0e8a74d8e2be89811b6e1ff4b975f82de01a117d342e: Status 404 returned error can't find the container with id 23397a8e9e48e816dbea0e8a74d8e2be89811b6e1ff4b975f82de01a117d342e Apr 22 21:12:49.801038 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:49.800999 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" event={"ID":"257ddb76-5df7-44e1-8222-4f8fd3909da9","Type":"ContainerStarted","Data":"23397a8e9e48e816dbea0e8a74d8e2be89811b6e1ff4b975f82de01a117d342e"} Apr 22 21:12:51.809984 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:51.809944 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" event={"ID":"257ddb76-5df7-44e1-8222-4f8fd3909da9","Type":"ContainerStarted","Data":"2a761ca002b525a2200a3bbad426377984e7c1ba6450ed24459c49890eaf8303"} Apr 22 21:12:51.810374 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:51.810003 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:12:51.824524 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:12:51.824472 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" podStartSLOduration=1.978139744 podStartE2EDuration="3.824454912s" podCreationTimestamp="2026-04-22 21:12:48 +0000 UTC" firstStartedPulling="2026-04-22 21:12:49.383823865 +0000 UTC m=+217.256715695" lastFinishedPulling="2026-04-22 21:12:51.230139033 +0000 UTC m=+219.103030863" observedRunningTime="2026-04-22 21:12:51.82304559 +0000 UTC m=+219.695937572" watchObservedRunningTime="2026-04-22 21:12:51.824454912 +0000 UTC m=+219.697346769" Apr 22 21:13:00.870679 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.870645 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml"] Apr 22 21:13:00.879162 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.879108 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:00.881484 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.881461 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 22 21:13:00.882389 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.881550 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-rsqq4\"" Apr 22 21:13:00.882717 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.882696 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 21:13:00.882851 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.882828 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 22 21:13:00.882961 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.882776 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 21:13:00.885207 
ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.885189 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml"] Apr 22 21:13:00.890167 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.890133 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb"] Apr 22 21:13:00.892533 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.892517 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:00.894550 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.894526 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 21:13:00.894652 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.894528 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8dc7t\"" Apr 22 21:13:00.894652 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.894604 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 21:13:00.901687 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.901666 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb"] Apr 22 21:13:00.946050 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.946025 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a-tmp\") pod \"kube-auth-proxy-567cb9698d-xq5ml\" (UID: \"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:00.946215 ip-10-0-133-75 kubenswrapper[2569]: 
I0422 21:13:00.946117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a-tls-certs\") pod \"kube-auth-proxy-567cb9698d-xq5ml\" (UID: \"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:00.946280 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:00.946235 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp2hf\" (UniqueName: \"kubernetes.io/projected/8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a-kube-api-access-cp2hf\") pod \"kube-auth-proxy-567cb9698d-xq5ml\" (UID: \"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:01.047581 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.047545 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.047581 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.047584 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a-tmp\") pod \"kube-auth-proxy-567cb9698d-xq5ml\" (UID: \"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:01.047760 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.047652 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a-tls-certs\") pod 
\"kube-auth-proxy-567cb9698d-xq5ml\" (UID: \"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:01.047760 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.047707 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp2hf\" (UniqueName: \"kubernetes.io/projected/8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a-kube-api-access-cp2hf\") pod \"kube-auth-proxy-567cb9698d-xq5ml\" (UID: \"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:01.047760 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.047740 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swtcr\" (UniqueName: \"kubernetes.io/projected/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-kube-api-access-swtcr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.047864 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.047771 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.049756 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.049731 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a-tmp\") pod \"kube-auth-proxy-567cb9698d-xq5ml\" (UID: \"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 
21:13:01.050054 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.050036 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a-tls-certs\") pod \"kube-auth-proxy-567cb9698d-xq5ml\" (UID: \"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:01.054955 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.054931 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp2hf\" (UniqueName: \"kubernetes.io/projected/8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a-kube-api-access-cp2hf\") pod \"kube-auth-proxy-567cb9698d-xq5ml\" (UID: \"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a\") " pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:01.148606 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.148529 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.148606 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.148593 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swtcr\" (UniqueName: \"kubernetes.io/projected/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-kube-api-access-swtcr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.148819 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.148620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.149025 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.149001 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.149122 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.149039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.155966 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.155944 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swtcr\" (UniqueName: \"kubernetes.io/projected/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-kube-api-access-swtcr\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.190774 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.190750 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" Apr 22 21:13:01.202614 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.202594 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:01.351415 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.351378 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml"] Apr 22 21:13:01.352222 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:13:01.352137 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fd18c56_fd13_490f_8e1b_a5c5afa8cc0a.slice/crio-4080643d76d3bdf33722f6c2302d3decbcf64e91f49a33b0f56f7a2d660df037 WatchSource:0}: Error finding container 4080643d76d3bdf33722f6c2302d3decbcf64e91f49a33b0f56f7a2d660df037: Status 404 returned error can't find the container with id 4080643d76d3bdf33722f6c2302d3decbcf64e91f49a33b0f56f7a2d660df037 Apr 22 21:13:01.366210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.366185 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb"] Apr 22 21:13:01.368222 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:13:01.368189 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0ed6ad2_e03d_4ae3_95f2_0b826990d902.slice/crio-4f2ab49c90ab2872082c4a043cfcec315771b2808db8265cdea0ab1b16bcf78f WatchSource:0}: Error finding container 4f2ab49c90ab2872082c4a043cfcec315771b2808db8265cdea0ab1b16bcf78f: Status 404 returned error can't find the container with id 4f2ab49c90ab2872082c4a043cfcec315771b2808db8265cdea0ab1b16bcf78f Apr 22 21:13:01.843591 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.843558 2569 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" event={"ID":"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a","Type":"ContainerStarted","Data":"4080643d76d3bdf33722f6c2302d3decbcf64e91f49a33b0f56f7a2d660df037"} Apr 22 21:13:01.844774 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.844746 2569 generic.go:358] "Generic (PLEG): container finished" podID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerID="ed4c1dbcd0d9f9dd2e4ed809272fddcee8b7fc258952a9cbdfd4fdd1d1404154" exitCode=0 Apr 22 21:13:01.844910 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.844813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" event={"ID":"f0ed6ad2-e03d-4ae3-95f2-0b826990d902","Type":"ContainerDied","Data":"ed4c1dbcd0d9f9dd2e4ed809272fddcee8b7fc258952a9cbdfd4fdd1d1404154"} Apr 22 21:13:01.844910 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:01.844830 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" event={"ID":"f0ed6ad2-e03d-4ae3-95f2-0b826990d902","Type":"ContainerStarted","Data":"4f2ab49c90ab2872082c4a043cfcec315771b2808db8265cdea0ab1b16bcf78f"} Apr 22 21:13:02.815804 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:02.815777 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7979f84667-9gxqw" Apr 22 21:13:03.854497 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:03.854462 2569 generic.go:358] "Generic (PLEG): container finished" podID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerID="ed6624aa9d6cc918e576c1d86bb8d2daf6e42c27003e839dc53c7b617eceacd8" exitCode=0 Apr 22 21:13:03.854897 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:03.854543 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" 
event={"ID":"f0ed6ad2-e03d-4ae3-95f2-0b826990d902","Type":"ContainerDied","Data":"ed6624aa9d6cc918e576c1d86bb8d2daf6e42c27003e839dc53c7b617eceacd8"} Apr 22 21:13:04.860240 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:04.860197 2569 generic.go:358] "Generic (PLEG): container finished" podID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerID="d7433d950b420e4ad76b7865334b0ac5431e9aea728339a0f624780d30fca5d0" exitCode=0 Apr 22 21:13:04.860581 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:04.860282 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" event={"ID":"f0ed6ad2-e03d-4ae3-95f2-0b826990d902","Type":"ContainerDied","Data":"d7433d950b420e4ad76b7865334b0ac5431e9aea728339a0f624780d30fca5d0"} Apr 22 21:13:05.866965 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:05.866883 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" event={"ID":"8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a","Type":"ContainerStarted","Data":"7e0928feb90746327e4c7f31b8c9bec553396747b12c513031b62b714400e83f"} Apr 22 21:13:05.883435 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:05.883388 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-567cb9698d-xq5ml" podStartSLOduration=1.711630652 podStartE2EDuration="5.883372953s" podCreationTimestamp="2026-04-22 21:13:00 +0000 UTC" firstStartedPulling="2026-04-22 21:13:01.353884645 +0000 UTC m=+229.226776474" lastFinishedPulling="2026-04-22 21:13:05.525626938 +0000 UTC m=+233.398518775" observedRunningTime="2026-04-22 21:13:05.881139313 +0000 UTC m=+233.754031187" watchObservedRunningTime="2026-04-22 21:13:05.883372953 +0000 UTC m=+233.756264805" Apr 22 21:13:05.993951 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:05.993927 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:06.091780 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.091749 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-util\") pod \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " Apr 22 21:13:06.091931 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.091839 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-bundle\") pod \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " Apr 22 21:13:06.091931 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.091870 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swtcr\" (UniqueName: \"kubernetes.io/projected/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-kube-api-access-swtcr\") pod \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\" (UID: \"f0ed6ad2-e03d-4ae3-95f2-0b826990d902\") " Apr 22 21:13:06.092701 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.092664 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-bundle" (OuterVolumeSpecName: "bundle") pod "f0ed6ad2-e03d-4ae3-95f2-0b826990d902" (UID: "f0ed6ad2-e03d-4ae3-95f2-0b826990d902"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:13:06.093961 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.093933 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-kube-api-access-swtcr" (OuterVolumeSpecName: "kube-api-access-swtcr") pod "f0ed6ad2-e03d-4ae3-95f2-0b826990d902" (UID: "f0ed6ad2-e03d-4ae3-95f2-0b826990d902"). InnerVolumeSpecName "kube-api-access-swtcr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:13:06.096340 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.096320 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-util" (OuterVolumeSpecName: "util") pod "f0ed6ad2-e03d-4ae3-95f2-0b826990d902" (UID: "f0ed6ad2-e03d-4ae3-95f2-0b826990d902"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:13:06.193372 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.193320 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:13:06.193372 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.193343 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-swtcr\" (UniqueName: \"kubernetes.io/projected/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-kube-api-access-swtcr\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:13:06.193372 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.193353 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f0ed6ad2-e03d-4ae3-95f2-0b826990d902-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:13:06.871621 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.871589 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" event={"ID":"f0ed6ad2-e03d-4ae3-95f2-0b826990d902","Type":"ContainerDied","Data":"4f2ab49c90ab2872082c4a043cfcec315771b2808db8265cdea0ab1b16bcf78f"} Apr 22 21:13:06.871621 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.871615 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483529ctb" Apr 22 21:13:06.872059 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:06.871637 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f2ab49c90ab2872082c4a043cfcec315771b2808db8265cdea0ab1b16bcf78f" Apr 22 21:13:14.991013 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:14.990976 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw"] Apr 22 21:13:14.991546 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:14.991525 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerName="extract" Apr 22 21:13:14.991628 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:14.991549 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerName="extract" Apr 22 21:13:14.991628 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:14.991566 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerName="pull" Apr 22 21:13:14.991628 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:14.991574 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerName="pull" Apr 22 21:13:14.991628 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:14.991598 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerName="util" Apr 22 21:13:14.991628 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:14.991605 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerName="util" Apr 22 21:13:14.991870 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:14.991684 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f0ed6ad2-e03d-4ae3-95f2-0b826990d902" containerName="extract" Apr 22 21:13:14.998196 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:14.998174 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.004007 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.003315 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 21:13:15.004007 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.003934 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8dc7t\"" Apr 22 21:13:15.004292 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.004182 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 21:13:15.007181 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.007158 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw"] Apr 22 21:13:15.175092 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.175060 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.175290 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.175101 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.175290 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.175165 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxkf9\" (UniqueName: \"kubernetes.io/projected/3a41d5ba-61c3-4163-8334-88833be9b245-kube-api-access-qxkf9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.275984 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.275956 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxkf9\" (UniqueName: \"kubernetes.io/projected/3a41d5ba-61c3-4163-8334-88833be9b245-kube-api-access-qxkf9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.276134 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.276115 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.276200 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.276175 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.276482 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.276466 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.276526 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.276479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.292187 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.292167 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxkf9\" (UniqueName: \"kubernetes.io/projected/3a41d5ba-61c3-4163-8334-88833be9b245-kube-api-access-qxkf9\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 
21:13:15.308991 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.308973 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:15.437939 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.437787 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw"] Apr 22 21:13:15.440036 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:13:15.439996 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a41d5ba_61c3_4163_8334_88833be9b245.slice/crio-2181a814e98e9099b6d7511898c5267f88a667e97af584f82cdc359310c44bb2 WatchSource:0}: Error finding container 2181a814e98e9099b6d7511898c5267f88a667e97af584f82cdc359310c44bb2: Status 404 returned error can't find the container with id 2181a814e98e9099b6d7511898c5267f88a667e97af584f82cdc359310c44bb2 Apr 22 21:13:15.906991 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.906906 2569 generic.go:358] "Generic (PLEG): container finished" podID="3a41d5ba-61c3-4163-8334-88833be9b245" containerID="2810a9d23259c6d78729f847ce099730b1ce1a426a3d1ad4ff4a43d1586f414c" exitCode=0 Apr 22 21:13:15.906991 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.906970 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" event={"ID":"3a41d5ba-61c3-4163-8334-88833be9b245","Type":"ContainerDied","Data":"2810a9d23259c6d78729f847ce099730b1ce1a426a3d1ad4ff4a43d1586f414c"} Apr 22 21:13:15.907182 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:15.906999 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" 
event={"ID":"3a41d5ba-61c3-4163-8334-88833be9b245","Type":"ContainerStarted","Data":"2181a814e98e9099b6d7511898c5267f88a667e97af584f82cdc359310c44bb2"} Apr 22 21:13:16.911797 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:16.911762 2569 generic.go:358] "Generic (PLEG): container finished" podID="3a41d5ba-61c3-4163-8334-88833be9b245" containerID="3a8280fe0dfb02ad482c80240462684fd75f0eafd16df2d4cb12be3eecf0d3e7" exitCode=0 Apr 22 21:13:16.912246 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:16.911846 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" event={"ID":"3a41d5ba-61c3-4163-8334-88833be9b245","Type":"ContainerDied","Data":"3a8280fe0dfb02ad482c80240462684fd75f0eafd16df2d4cb12be3eecf0d3e7"} Apr 22 21:13:17.917402 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:17.917367 2569 generic.go:358] "Generic (PLEG): container finished" podID="3a41d5ba-61c3-4163-8334-88833be9b245" containerID="6b2728f52792c4bd52d325f65780b49d111566f64b1e5ad1692c009be01755ca" exitCode=0 Apr 22 21:13:17.917759 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:17.917417 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" event={"ID":"3a41d5ba-61c3-4163-8334-88833be9b245","Type":"ContainerDied","Data":"6b2728f52792c4bd52d325f65780b49d111566f64b1e5ad1692c009be01755ca"} Apr 22 21:13:19.039014 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.038985 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:19.205350 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.205269 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-util\") pod \"3a41d5ba-61c3-4163-8334-88833be9b245\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " Apr 22 21:13:19.205522 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.205399 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxkf9\" (UniqueName: \"kubernetes.io/projected/3a41d5ba-61c3-4163-8334-88833be9b245-kube-api-access-qxkf9\") pod \"3a41d5ba-61c3-4163-8334-88833be9b245\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " Apr 22 21:13:19.205522 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.205425 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-bundle\") pod \"3a41d5ba-61c3-4163-8334-88833be9b245\" (UID: \"3a41d5ba-61c3-4163-8334-88833be9b245\") " Apr 22 21:13:19.206391 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.206367 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-bundle" (OuterVolumeSpecName: "bundle") pod "3a41d5ba-61c3-4163-8334-88833be9b245" (UID: "3a41d5ba-61c3-4163-8334-88833be9b245"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:13:19.207530 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.207507 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a41d5ba-61c3-4163-8334-88833be9b245-kube-api-access-qxkf9" (OuterVolumeSpecName: "kube-api-access-qxkf9") pod "3a41d5ba-61c3-4163-8334-88833be9b245" (UID: "3a41d5ba-61c3-4163-8334-88833be9b245"). InnerVolumeSpecName "kube-api-access-qxkf9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:13:19.210675 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.210652 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-util" (OuterVolumeSpecName: "util") pod "3a41d5ba-61c3-4163-8334-88833be9b245" (UID: "3a41d5ba-61c3-4163-8334-88833be9b245"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:13:19.306866 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.306839 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qxkf9\" (UniqueName: \"kubernetes.io/projected/3a41d5ba-61c3-4163-8334-88833be9b245-kube-api-access-qxkf9\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:13:19.306866 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.306863 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:13:19.307005 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.306873 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3a41d5ba-61c3-4163-8334-88833be9b245-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:13:19.926668 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.926623 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" event={"ID":"3a41d5ba-61c3-4163-8334-88833be9b245","Type":"ContainerDied","Data":"2181a814e98e9099b6d7511898c5267f88a667e97af584f82cdc359310c44bb2"} Apr 22 21:13:19.926668 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.926658 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebh76sw" Apr 22 21:13:19.926668 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:13:19.926669 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2181a814e98e9099b6d7511898c5267f88a667e97af584f82cdc359310c44bb2" Apr 22 21:14:05.259809 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.259776 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq"] Apr 22 21:14:05.260222 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.260162 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a41d5ba-61c3-4163-8334-88833be9b245" containerName="pull" Apr 22 21:14:05.260222 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.260174 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a41d5ba-61c3-4163-8334-88833be9b245" containerName="pull" Apr 22 21:14:05.260222 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.260198 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a41d5ba-61c3-4163-8334-88833be9b245" containerName="extract" Apr 22 21:14:05.260222 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.260207 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a41d5ba-61c3-4163-8334-88833be9b245" containerName="extract" Apr 22 21:14:05.260352 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.260228 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="3a41d5ba-61c3-4163-8334-88833be9b245" containerName="util" Apr 22 21:14:05.260352 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.260234 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a41d5ba-61c3-4163-8334-88833be9b245" containerName="util" Apr 22 21:14:05.260352 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.260284 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a41d5ba-61c3-4163-8334-88833be9b245" containerName="extract" Apr 22 21:14:05.262325 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.262310 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.264544 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.264524 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 21:14:05.264646 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.264567 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-d8nc4\"" Apr 22 21:14:05.265480 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.265467 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 21:14:05.273123 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.273104 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq"] Apr 22 21:14:05.370332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.370303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.370332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.370333 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.370502 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.370385 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdpzn\" (UniqueName: \"kubernetes.io/projected/d3569a2f-8690-48b6-8973-ea29801279cc-kube-api-access-fdpzn\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.470918 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.470878 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.470918 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.470920 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.471086 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.470962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fdpzn\" (UniqueName: \"kubernetes.io/projected/d3569a2f-8690-48b6-8973-ea29801279cc-kube-api-access-fdpzn\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.471330 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.471308 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.471377 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.471315 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.487401 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.487375 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdpzn\" (UniqueName: \"kubernetes.io/projected/d3569a2f-8690-48b6-8973-ea29801279cc-kube-api-access-fdpzn\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 
21:14:05.571712 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.571637 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:05.685646 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:05.685607 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq"] Apr 22 21:14:05.688381 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:14:05.688354 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3569a2f_8690_48b6_8973_ea29801279cc.slice/crio-8cad36f751672652cbfd95b0a9e51311fdf0e17852d86a987c35303e6599ef1a WatchSource:0}: Error finding container 8cad36f751672652cbfd95b0a9e51311fdf0e17852d86a987c35303e6599ef1a: Status 404 returned error can't find the container with id 8cad36f751672652cbfd95b0a9e51311fdf0e17852d86a987c35303e6599ef1a Apr 22 21:14:06.013048 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.013018 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g"] Apr 22 21:14:06.015622 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.015607 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.021419 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.021390 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g"] Apr 22 21:14:06.084135 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.084103 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3569a2f-8690-48b6-8973-ea29801279cc" containerID="41bd824f0f26d9efe68f4a652c28c8b403c5c05ba95487728b84a437ba18c35d" exitCode=0 Apr 22 21:14:06.084291 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.084193 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" event={"ID":"d3569a2f-8690-48b6-8973-ea29801279cc","Type":"ContainerDied","Data":"41bd824f0f26d9efe68f4a652c28c8b403c5c05ba95487728b84a437ba18c35d"} Apr 22 21:14:06.084291 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.084222 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" event={"ID":"d3569a2f-8690-48b6-8973-ea29801279cc","Type":"ContainerStarted","Data":"8cad36f751672652cbfd95b0a9e51311fdf0e17852d86a987c35303e6599ef1a"} Apr 22 21:14:06.179099 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.179072 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.179259 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.179168 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.179259 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.179189 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s45sg\" (UniqueName: \"kubernetes.io/projected/f9f19330-b51c-4326-b771-8a44d93476d1-kube-api-access-s45sg\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.280216 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.280127 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.280216 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.280179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s45sg\" (UniqueName: \"kubernetes.io/projected/f9f19330-b51c-4326-b771-8a44d93476d1-kube-api-access-s45sg\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.280557 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.280238 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.280557 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.280496 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.280623 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.280572 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.287621 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.287597 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s45sg\" (UniqueName: \"kubernetes.io/projected/f9f19330-b51c-4326-b771-8a44d93476d1-kube-api-access-s45sg\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.325568 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.325546 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:06.438436 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.438412 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g"] Apr 22 21:14:06.440245 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:14:06.440219 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f19330_b51c_4326_b771_8a44d93476d1.slice/crio-3f1d06423d181c065a495f5cc0c2817b02adf13233ae489845aee1188623eaaa WatchSource:0}: Error finding container 3f1d06423d181c065a495f5cc0c2817b02adf13233ae489845aee1188623eaaa: Status 404 returned error can't find the container with id 3f1d06423d181c065a495f5cc0c2817b02adf13233ae489845aee1188623eaaa Apr 22 21:14:06.613903 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.613868 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh"] Apr 22 21:14:06.616357 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.616338 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.621910 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.621890 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh"] Apr 22 21:14:06.785495 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.785430 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vskjq\" (UniqueName: \"kubernetes.io/projected/382a169a-813d-4ae9-b423-42732d0bf9cf-kube-api-access-vskjq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.785495 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.785486 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.785672 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.785588 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.886017 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.885992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.886113 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.886053 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.886113 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.886105 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vskjq\" (UniqueName: \"kubernetes.io/projected/382a169a-813d-4ae9-b423-42732d0bf9cf-kube-api-access-vskjq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.886448 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.886426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.886520 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.886458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-util\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.893678 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.893656 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vskjq\" (UniqueName: \"kubernetes.io/projected/382a169a-813d-4ae9-b423-42732d0bf9cf-kube-api-access-vskjq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:06.926704 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:06.926680 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:07.072962 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.072939 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh"] Apr 22 21:14:07.074432 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:14:07.074409 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod382a169a_813d_4ae9_b423_42732d0bf9cf.slice/crio-fd4a605cbb0d6dcedfe9d2e2ea5b0fe67bef0d958c19ca0fcb09b71b369729aa WatchSource:0}: Error finding container fd4a605cbb0d6dcedfe9d2e2ea5b0fe67bef0d958c19ca0fcb09b71b369729aa: Status 404 returned error can't find the container with id fd4a605cbb0d6dcedfe9d2e2ea5b0fe67bef0d958c19ca0fcb09b71b369729aa Apr 22 21:14:07.089663 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.089634 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" 
event={"ID":"382a169a-813d-4ae9-b423-42732d0bf9cf","Type":"ContainerStarted","Data":"fd4a605cbb0d6dcedfe9d2e2ea5b0fe67bef0d958c19ca0fcb09b71b369729aa"} Apr 22 21:14:07.091254 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.091228 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3569a2f-8690-48b6-8973-ea29801279cc" containerID="62d254e0ce6abd4baeac9e0b67ec87a4b21b199ebc180c85348f4f9611b4c2ee" exitCode=0 Apr 22 21:14:07.091358 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.091316 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" event={"ID":"d3569a2f-8690-48b6-8973-ea29801279cc","Type":"ContainerDied","Data":"62d254e0ce6abd4baeac9e0b67ec87a4b21b199ebc180c85348f4f9611b4c2ee"} Apr 22 21:14:07.092965 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.092946 2569 generic.go:358] "Generic (PLEG): container finished" podID="f9f19330-b51c-4326-b771-8a44d93476d1" containerID="8ed98461b17bd7e9c988db67a16b2190bf2cbc52c40b2ee656ad0b570e18d007" exitCode=0 Apr 22 21:14:07.093079 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.093032 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" event={"ID":"f9f19330-b51c-4326-b771-8a44d93476d1","Type":"ContainerDied","Data":"8ed98461b17bd7e9c988db67a16b2190bf2cbc52c40b2ee656ad0b570e18d007"} Apr 22 21:14:07.093079 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.093070 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" event={"ID":"f9f19330-b51c-4326-b771-8a44d93476d1","Type":"ContainerStarted","Data":"3f1d06423d181c065a495f5cc0c2817b02adf13233ae489845aee1188623eaaa"} Apr 22 21:14:07.098989 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.098969 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj"] Apr 22 21:14:07.101878 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.101859 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.113589 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.113566 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj"] Apr 22 21:14:07.189252 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.189225 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.189369 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.189267 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.189369 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.189332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz89w\" (UniqueName: \"kubernetes.io/projected/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-kube-api-access-bz89w\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.290116 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.290084 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.290517 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.290126 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.290517 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.290202 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bz89w\" (UniqueName: \"kubernetes.io/projected/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-kube-api-access-bz89w\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.290517 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.290506 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.290675 
ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.290517 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.297676 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.297655 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz89w\" (UniqueName: \"kubernetes.io/projected/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-kube-api-access-bz89w\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.428020 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.427951 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:07.544239 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:07.544214 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj"] Apr 22 21:14:07.545935 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:14:07.545910 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce7edcdd_59ce_46aa_ac0f_da1c0bb6b70c.slice/crio-278d02ad6748b99296ff4d0db95543f298dd8675ec8b25d6c78930b8cb611eba WatchSource:0}: Error finding container 278d02ad6748b99296ff4d0db95543f298dd8675ec8b25d6c78930b8cb611eba: Status 404 returned error can't find the container with id 278d02ad6748b99296ff4d0db95543f298dd8675ec8b25d6c78930b8cb611eba Apr 22 21:14:08.097734 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:08.097700 2569 generic.go:358] "Generic (PLEG): container finished" podID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerID="83845880e8ddd4a2aad998c27074349404df61fe99d6e8129f2397ca52ee75ff" exitCode=0 Apr 22 21:14:08.097939 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:08.097774 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" event={"ID":"382a169a-813d-4ae9-b423-42732d0bf9cf","Type":"ContainerDied","Data":"83845880e8ddd4a2aad998c27074349404df61fe99d6e8129f2397ca52ee75ff"} Apr 22 21:14:08.099131 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:08.099110 2569 generic.go:358] "Generic (PLEG): container finished" podID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerID="9d44ef867ae494ffa8f1d7745b022f05b9e21877c4210c915eceee95d41d6cfd" exitCode=0 Apr 22 21:14:08.099215 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:08.099200 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" event={"ID":"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c","Type":"ContainerDied","Data":"9d44ef867ae494ffa8f1d7745b022f05b9e21877c4210c915eceee95d41d6cfd"} Apr 22 21:14:08.099259 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:08.099229 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" event={"ID":"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c","Type":"ContainerStarted","Data":"278d02ad6748b99296ff4d0db95543f298dd8675ec8b25d6c78930b8cb611eba"} Apr 22 21:14:08.101307 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:08.101176 2569 generic.go:358] "Generic (PLEG): container finished" podID="d3569a2f-8690-48b6-8973-ea29801279cc" containerID="5f757cfd2616a024608eb430cdd345e9d4714765d39ad584353f7704814f8e41" exitCode=0 Apr 22 21:14:08.101307 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:08.101249 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" event={"ID":"d3569a2f-8690-48b6-8973-ea29801279cc","Type":"ContainerDied","Data":"5f757cfd2616a024608eb430cdd345e9d4714765d39ad584353f7704814f8e41"} Apr 22 21:14:08.102902 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:08.102882 2569 generic.go:358] "Generic (PLEG): container finished" podID="f9f19330-b51c-4326-b771-8a44d93476d1" containerID="d69448aa145858986cb28cfb748382afb8ebd1451f89b63fd9724aeb457d8241" exitCode=0 Apr 22 21:14:08.103009 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:08.102968 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" event={"ID":"f9f19330-b51c-4326-b771-8a44d93476d1","Type":"ContainerDied","Data":"d69448aa145858986cb28cfb748382afb8ebd1451f89b63fd9724aeb457d8241"} Apr 22 21:14:09.108674 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.108640 2569 
generic.go:358] "Generic (PLEG): container finished" podID="f9f19330-b51c-4326-b771-8a44d93476d1" containerID="c2d65b2690ba83a2aa4947928290d11b73b132b9ad1d87b62523c11224aba5e2" exitCode=0 Apr 22 21:14:09.109046 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.108724 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" event={"ID":"f9f19330-b51c-4326-b771-8a44d93476d1","Type":"ContainerDied","Data":"c2d65b2690ba83a2aa4947928290d11b73b132b9ad1d87b62523c11224aba5e2"} Apr 22 21:14:09.110270 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.110247 2569 generic.go:358] "Generic (PLEG): container finished" podID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerID="39c5f444dcdfe16495dc033c6e87106f2ce0ee537de60feec3afed3e2be5f5f9" exitCode=0 Apr 22 21:14:09.110375 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.110322 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" event={"ID":"382a169a-813d-4ae9-b423-42732d0bf9cf","Type":"ContainerDied","Data":"39c5f444dcdfe16495dc033c6e87106f2ce0ee537de60feec3afed3e2be5f5f9"} Apr 22 21:14:09.114921 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.114900 2569 generic.go:358] "Generic (PLEG): container finished" podID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerID="0231c6ce77933b41b9820da37771f734e21951e31ea0450ba4adad26cffede74" exitCode=0 Apr 22 21:14:09.115027 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.114973 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" event={"ID":"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c","Type":"ContainerDied","Data":"0231c6ce77933b41b9820da37771f734e21951e31ea0450ba4adad26cffede74"} Apr 22 21:14:09.235337 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.235311 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:09.411035 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.410974 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-util\") pod \"d3569a2f-8690-48b6-8973-ea29801279cc\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " Apr 22 21:14:09.411035 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.411031 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdpzn\" (UniqueName: \"kubernetes.io/projected/d3569a2f-8690-48b6-8973-ea29801279cc-kube-api-access-fdpzn\") pod \"d3569a2f-8690-48b6-8973-ea29801279cc\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " Apr 22 21:14:09.411267 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.411080 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-bundle\") pod \"d3569a2f-8690-48b6-8973-ea29801279cc\" (UID: \"d3569a2f-8690-48b6-8973-ea29801279cc\") " Apr 22 21:14:09.411648 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.411620 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-bundle" (OuterVolumeSpecName: "bundle") pod "d3569a2f-8690-48b6-8973-ea29801279cc" (UID: "d3569a2f-8690-48b6-8973-ea29801279cc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:14:09.413169 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.413131 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3569a2f-8690-48b6-8973-ea29801279cc-kube-api-access-fdpzn" (OuterVolumeSpecName: "kube-api-access-fdpzn") pod "d3569a2f-8690-48b6-8973-ea29801279cc" (UID: "d3569a2f-8690-48b6-8973-ea29801279cc"). InnerVolumeSpecName "kube-api-access-fdpzn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:14:09.416102 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.416079 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-util" (OuterVolumeSpecName: "util") pod "d3569a2f-8690-48b6-8973-ea29801279cc" (UID: "d3569a2f-8690-48b6-8973-ea29801279cc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:14:09.511831 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.511806 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fdpzn\" (UniqueName: \"kubernetes.io/projected/d3569a2f-8690-48b6-8973-ea29801279cc-kube-api-access-fdpzn\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:09.511831 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.511830 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:09.511982 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:09.511839 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3569a2f-8690-48b6-8973-ea29801279cc-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:10.120505 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.120471 2569 generic.go:358] "Generic (PLEG): 
container finished" podID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerID="18b935c0f1ceec218f36c1002a7a908ce3eb735053459ac02c7aeff854b22138" exitCode=0 Apr 22 21:14:10.120897 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.120556 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" event={"ID":"382a169a-813d-4ae9-b423-42732d0bf9cf","Type":"ContainerDied","Data":"18b935c0f1ceec218f36c1002a7a908ce3eb735053459ac02c7aeff854b22138"} Apr 22 21:14:10.122380 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.122359 2569 generic.go:358] "Generic (PLEG): container finished" podID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerID="1dee7f35de9da5c5b20c67c0d892041ce3e70057c9c29d0f51729701e1ba5a93" exitCode=0 Apr 22 21:14:10.122502 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.122438 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" event={"ID":"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c","Type":"ContainerDied","Data":"1dee7f35de9da5c5b20c67c0d892041ce3e70057c9c29d0f51729701e1ba5a93"} Apr 22 21:14:10.123910 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.123890 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" event={"ID":"d3569a2f-8690-48b6-8973-ea29801279cc","Type":"ContainerDied","Data":"8cad36f751672652cbfd95b0a9e51311fdf0e17852d86a987c35303e6599ef1a"} Apr 22 21:14:10.123971 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.123919 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cad36f751672652cbfd95b0a9e51311fdf0e17852d86a987c35303e6599ef1a" Apr 22 21:14:10.124066 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.124053 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq" Apr 22 21:14:10.248781 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.248762 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:10.419349 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.419268 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s45sg\" (UniqueName: \"kubernetes.io/projected/f9f19330-b51c-4326-b771-8a44d93476d1-kube-api-access-s45sg\") pod \"f9f19330-b51c-4326-b771-8a44d93476d1\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " Apr 22 21:14:10.419349 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.419331 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-bundle\") pod \"f9f19330-b51c-4326-b771-8a44d93476d1\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " Apr 22 21:14:10.419551 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.419364 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-util\") pod \"f9f19330-b51c-4326-b771-8a44d93476d1\" (UID: \"f9f19330-b51c-4326-b771-8a44d93476d1\") " Apr 22 21:14:10.419907 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.419880 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-bundle" (OuterVolumeSpecName: "bundle") pod "f9f19330-b51c-4326-b771-8a44d93476d1" (UID: "f9f19330-b51c-4326-b771-8a44d93476d1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:14:10.421443 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.421417 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f19330-b51c-4326-b771-8a44d93476d1-kube-api-access-s45sg" (OuterVolumeSpecName: "kube-api-access-s45sg") pod "f9f19330-b51c-4326-b771-8a44d93476d1" (UID: "f9f19330-b51c-4326-b771-8a44d93476d1"). InnerVolumeSpecName "kube-api-access-s45sg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:14:10.425063 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.425031 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-util" (OuterVolumeSpecName: "util") pod "f9f19330-b51c-4326-b771-8a44d93476d1" (UID: "f9f19330-b51c-4326-b771-8a44d93476d1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:14:10.519889 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.519863 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:10.519889 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.519886 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s45sg\" (UniqueName: \"kubernetes.io/projected/f9f19330-b51c-4326-b771-8a44d93476d1-kube-api-access-s45sg\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:10.520021 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:10.519896 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9f19330-b51c-4326-b771-8a44d93476d1-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:11.129209 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.129108 2569 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" event={"ID":"f9f19330-b51c-4326-b771-8a44d93476d1","Type":"ContainerDied","Data":"3f1d06423d181c065a495f5cc0c2817b02adf13233ae489845aee1188623eaaa"} Apr 22 21:14:11.129209 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.129156 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g" Apr 22 21:14:11.129609 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.129159 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f1d06423d181c065a495f5cc0c2817b02adf13233ae489845aee1188623eaaa" Apr 22 21:14:11.275094 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.275076 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:11.278698 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.278680 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:11.427367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.427304 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-bundle\") pod \"382a169a-813d-4ae9-b423-42732d0bf9cf\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " Apr 22 21:14:11.427367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.427333 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz89w\" (UniqueName: \"kubernetes.io/projected/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-kube-api-access-bz89w\") pod \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " Apr 22 21:14:11.427549 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.427371 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-bundle\") pod \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " Apr 22 21:14:11.427549 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.427500 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-util\") pod \"382a169a-813d-4ae9-b423-42732d0bf9cf\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " Apr 22 21:14:11.427549 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.427537 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-util\") pod \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\" (UID: \"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c\") " Apr 22 21:14:11.427693 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.427580 2569 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vskjq\" (UniqueName: \"kubernetes.io/projected/382a169a-813d-4ae9-b423-42732d0bf9cf-kube-api-access-vskjq\") pod \"382a169a-813d-4ae9-b423-42732d0bf9cf\" (UID: \"382a169a-813d-4ae9-b423-42732d0bf9cf\") " Apr 22 21:14:11.428007 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.427963 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-bundle" (OuterVolumeSpecName: "bundle") pod "ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" (UID: "ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:14:11.428122 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.428041 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-bundle" (OuterVolumeSpecName: "bundle") pod "382a169a-813d-4ae9-b423-42732d0bf9cf" (UID: "382a169a-813d-4ae9-b423-42732d0bf9cf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:14:11.429544 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.429511 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-kube-api-access-bz89w" (OuterVolumeSpecName: "kube-api-access-bz89w") pod "ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" (UID: "ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c"). InnerVolumeSpecName "kube-api-access-bz89w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:14:11.429658 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.429603 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/382a169a-813d-4ae9-b423-42732d0bf9cf-kube-api-access-vskjq" (OuterVolumeSpecName: "kube-api-access-vskjq") pod "382a169a-813d-4ae9-b423-42732d0bf9cf" (UID: "382a169a-813d-4ae9-b423-42732d0bf9cf"). InnerVolumeSpecName "kube-api-access-vskjq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:14:11.433137 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.433101 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-util" (OuterVolumeSpecName: "util") pod "382a169a-813d-4ae9-b423-42732d0bf9cf" (UID: "382a169a-813d-4ae9-b423-42732d0bf9cf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:14:11.433413 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.433390 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-util" (OuterVolumeSpecName: "util") pod "ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" (UID: "ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:14:11.528585 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.528554 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:11.528585 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.528578 2569 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-util\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:11.528585 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.528588 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vskjq\" (UniqueName: \"kubernetes.io/projected/382a169a-813d-4ae9-b423-42732d0bf9cf-kube-api-access-vskjq\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:11.528793 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.528598 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/382a169a-813d-4ae9-b423-42732d0bf9cf-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:11.528793 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.528610 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bz89w\" (UniqueName: \"kubernetes.io/projected/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-kube-api-access-bz89w\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:11.528793 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:11.528618 2569 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c-bundle\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:14:12.134812 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:12.134786 2569 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" Apr 22 21:14:12.135231 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:12.134781 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh" event={"ID":"382a169a-813d-4ae9-b423-42732d0bf9cf","Type":"ContainerDied","Data":"fd4a605cbb0d6dcedfe9d2e2ea5b0fe67bef0d958c19ca0fcb09b71b369729aa"} Apr 22 21:14:12.135231 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:12.134889 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd4a605cbb0d6dcedfe9d2e2ea5b0fe67bef0d958c19ca0fcb09b71b369729aa" Apr 22 21:14:12.136474 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:12.136450 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" event={"ID":"ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c","Type":"ContainerDied","Data":"278d02ad6748b99296ff4d0db95543f298dd8675ec8b25d6c78930b8cb611eba"} Apr 22 21:14:12.136594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:12.136482 2569 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="278d02ad6748b99296ff4d0db95543f298dd8675ec8b25d6c78930b8cb611eba" Apr 22 21:14:12.136594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:12.136462 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj" Apr 22 21:14:12.669239 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:12.669213 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:14:12.670092 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:12.670060 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:14:12.672553 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:12.672536 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 21:14:30.389723 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.389691 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t"] Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390203 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3569a2f-8690-48b6-8973-ea29801279cc" containerName="pull" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390218 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3569a2f-8690-48b6-8973-ea29801279cc" containerName="pull" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390230 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9f19330-b51c-4326-b771-8a44d93476d1" containerName="util" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390235 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f19330-b51c-4326-b771-8a44d93476d1" containerName="util" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390242 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerName="pull" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390248 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerName="pull" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390259 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerName="util" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390263 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerName="util" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390269 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3569a2f-8690-48b6-8973-ea29801279cc" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390274 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3569a2f-8690-48b6-8973-ea29801279cc" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390283 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390289 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390296 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerName="pull" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390301 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerName="pull" Apr 22 
21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390308 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9f19330-b51c-4326-b771-8a44d93476d1" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390313 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f19330-b51c-4326-b771-8a44d93476d1" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390319 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390324 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390330 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerName="util" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390334 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerName="util" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390341 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9f19330-b51c-4326-b771-8a44d93476d1" containerName="pull" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390346 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f19330-b51c-4326-b771-8a44d93476d1" containerName="pull" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390352 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3569a2f-8690-48b6-8973-ea29801279cc" containerName="util" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:14:30.390356 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3569a2f-8690-48b6-8973-ea29801279cc" containerName="util" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390421 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390428 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9f19330-b51c-4326-b771-8a44d93476d1" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390435 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3569a2f-8690-48b6-8973-ea29801279cc" containerName="extract" Apr 22 21:14:30.391210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.390443 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="382a169a-813d-4ae9-b423-42732d0bf9cf" containerName="extract" Apr 22 21:14:30.392342 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.392323 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" Apr 22 21:14:30.394734 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.394714 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 21:14:30.394826 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.394737 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 22 21:14:30.394826 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.394796 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 21:14:30.395663 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.395647 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-mmcqs\"" Apr 22 21:14:30.406203 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.406180 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t"] Apr 22 21:14:30.476084 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.476059 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z47m8\" (UniqueName: \"kubernetes.io/projected/610ed830-b74b-4aca-b02f-3be039bea560-kube-api-access-z47m8\") pod \"dns-operator-controller-manager-648d5c98bc-vz65t\" (UID: \"610ed830-b74b-4aca-b02f-3be039bea560\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" Apr 22 21:14:30.576670 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.576645 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z47m8\" (UniqueName: \"kubernetes.io/projected/610ed830-b74b-4aca-b02f-3be039bea560-kube-api-access-z47m8\") pod \"dns-operator-controller-manager-648d5c98bc-vz65t\" 
(UID: \"610ed830-b74b-4aca-b02f-3be039bea560\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" Apr 22 21:14:30.586722 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.586702 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z47m8\" (UniqueName: \"kubernetes.io/projected/610ed830-b74b-4aca-b02f-3be039bea560-kube-api-access-z47m8\") pod \"dns-operator-controller-manager-648d5c98bc-vz65t\" (UID: \"610ed830-b74b-4aca-b02f-3be039bea560\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" Apr 22 21:14:30.702403 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.702336 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" Apr 22 21:14:30.821220 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.821194 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t"] Apr 22 21:14:30.822845 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:14:30.822816 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod610ed830_b74b_4aca_b02f_3be039bea560.slice/crio-0c96dee00b1becb4887337873ac769066dec6c2c49ed3bd702783f2aa4a1472d WatchSource:0}: Error finding container 0c96dee00b1becb4887337873ac769066dec6c2c49ed3bd702783f2aa4a1472d: Status 404 returned error can't find the container with id 0c96dee00b1becb4887337873ac769066dec6c2c49ed3bd702783f2aa4a1472d Apr 22 21:14:30.824788 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:30.824773 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:14:31.207039 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:31.207008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" 
event={"ID":"610ed830-b74b-4aca-b02f-3be039bea560","Type":"ContainerStarted","Data":"0c96dee00b1becb4887337873ac769066dec6c2c49ed3bd702783f2aa4a1472d"} Apr 22 21:14:34.219234 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:34.219123 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" event={"ID":"610ed830-b74b-4aca-b02f-3be039bea560","Type":"ContainerStarted","Data":"2e61c6717deb165b57b2374c077e6d68b64fafa9cae5c0f6966b3cdbdda0d9b9"} Apr 22 21:14:34.246729 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:34.246672 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" podStartSLOduration=1.2595823369999999 podStartE2EDuration="4.246656019s" podCreationTimestamp="2026-04-22 21:14:30 +0000 UTC" firstStartedPulling="2026-04-22 21:14:30.824891831 +0000 UTC m=+318.697783662" lastFinishedPulling="2026-04-22 21:14:33.811965511 +0000 UTC m=+321.684857344" observedRunningTime="2026-04-22 21:14:34.246035014 +0000 UTC m=+322.118926865" watchObservedRunningTime="2026-04-22 21:14:34.246656019 +0000 UTC m=+322.119547871" Apr 22 21:14:35.225729 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:35.225695 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" Apr 22 21:14:42.342973 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:42.342939 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq"] Apr 22 21:14:42.347989 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:42.347966 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" Apr 22 21:14:42.350737 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:42.350717 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-zmjvf\"" Apr 22 21:14:42.358299 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:42.358278 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq"] Apr 22 21:14:42.476983 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:42.476957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jks7\" (UniqueName: \"kubernetes.io/projected/d4e2addf-7473-44b3-b616-4fa938c3c366-kube-api-access-8jks7\") pod \"limitador-operator-controller-manager-85c4996f8c-rqnzq\" (UID: \"d4e2addf-7473-44b3-b616-4fa938c3c366\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" Apr 22 21:14:42.577546 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:42.577507 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jks7\" (UniqueName: \"kubernetes.io/projected/d4e2addf-7473-44b3-b616-4fa938c3c366-kube-api-access-8jks7\") pod \"limitador-operator-controller-manager-85c4996f8c-rqnzq\" (UID: \"d4e2addf-7473-44b3-b616-4fa938c3c366\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" Apr 22 21:14:42.585341 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:42.585318 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jks7\" (UniqueName: \"kubernetes.io/projected/d4e2addf-7473-44b3-b616-4fa938c3c366-kube-api-access-8jks7\") pod \"limitador-operator-controller-manager-85c4996f8c-rqnzq\" (UID: \"d4e2addf-7473-44b3-b616-4fa938c3c366\") " 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" Apr 22 21:14:42.658906 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:42.658833 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" Apr 22 21:14:42.782872 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:42.782847 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq"] Apr 22 21:14:42.784729 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:14:42.784688 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4e2addf_7473_44b3_b616_4fa938c3c366.slice/crio-5809734d124314c8c6da77d6a999ebf190c61c801de7f1b4fa1eefa7848bec3a WatchSource:0}: Error finding container 5809734d124314c8c6da77d6a999ebf190c61c801de7f1b4fa1eefa7848bec3a: Status 404 returned error can't find the container with id 5809734d124314c8c6da77d6a999ebf190c61c801de7f1b4fa1eefa7848bec3a Apr 22 21:14:43.255181 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:43.255110 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" event={"ID":"d4e2addf-7473-44b3-b616-4fa938c3c366","Type":"ContainerStarted","Data":"5809734d124314c8c6da77d6a999ebf190c61c801de7f1b4fa1eefa7848bec3a"} Apr 22 21:14:45.265001 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:45.264963 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" event={"ID":"d4e2addf-7473-44b3-b616-4fa938c3c366","Type":"ContainerStarted","Data":"25de882c47210354e4188d84861c467e0a76d49a45934db7b3547386ade7fbfe"} Apr 22 21:14:45.265420 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:45.265012 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" Apr 22 21:14:45.281525 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:45.281478 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" podStartSLOduration=1.705256029 podStartE2EDuration="3.281464372s" podCreationTimestamp="2026-04-22 21:14:42 +0000 UTC" firstStartedPulling="2026-04-22 21:14:42.78650218 +0000 UTC m=+330.659394010" lastFinishedPulling="2026-04-22 21:14:44.362710523 +0000 UTC m=+332.235602353" observedRunningTime="2026-04-22 21:14:45.279396677 +0000 UTC m=+333.152288529" watchObservedRunningTime="2026-04-22 21:14:45.281464372 +0000 UTC m=+333.154356223" Apr 22 21:14:46.231825 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:46.231794 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-vz65t" Apr 22 21:14:56.271607 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:14:56.271573 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rqnzq" Apr 22 21:15:18.173909 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.173875 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7678c4655f-9x24b"] Apr 22 21:15:18.178279 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.178263 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.180492 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.180468 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 21:15:18.181429 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.181391 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 21:15:18.181516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.181454 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 21:15:18.181516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.181453 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 21:15:18.181516 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.181487 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pg8x7\"" Apr 22 21:15:18.181673 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.181549 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 21:15:18.181673 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.181620 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 21:15:18.181788 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.181759 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 21:15:18.185987 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.185495 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 21:15:18.187876 ip-10-0-133-75 kubenswrapper[2569]: 
I0422 21:15:18.187857 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7678c4655f-9x24b"] Apr 22 21:15:18.269452 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.269426 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-oauth-serving-cert\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.269597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.269462 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-console-serving-cert\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.269597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.269481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-console-oauth-config\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.269597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.269498 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4vqt\" (UniqueName: \"kubernetes.io/projected/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-kube-api-access-t4vqt\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.269597 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.269557 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-service-ca\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.269727 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.269602 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-trusted-ca-bundle\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.269727 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.269664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-console-config\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.370485 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.370451 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-console-serving-cert\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.370485 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.370486 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-console-oauth-config\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 
22 21:15:18.370679 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.370507 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4vqt\" (UniqueName: \"kubernetes.io/projected/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-kube-api-access-t4vqt\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.370679 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.370533 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-service-ca\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.370793 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.370722 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-trusted-ca-bundle\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.370850 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.370790 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-console-config\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.370903 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.370849 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-oauth-serving-cert\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") 
" pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.371265 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.371240 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-service-ca\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.371629 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.371605 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-console-config\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.371629 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.371620 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-oauth-serving-cert\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.371767 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.371635 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-trusted-ca-bundle\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.373059 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.373032 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-console-serving-cert\") pod \"console-7678c4655f-9x24b\" (UID: 
\"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.373174 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.373057 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-console-oauth-config\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.377325 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.377304 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4vqt\" (UniqueName: \"kubernetes.io/projected/bd4c7d34-55d1-4a9c-bf61-6d432ee34dff-kube-api-access-t4vqt\") pod \"console-7678c4655f-9x24b\" (UID: \"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff\") " pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.488799 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.488753 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:18.610135 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:18.610108 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7678c4655f-9x24b"] Apr 22 21:15:18.611890 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:15:18.611860 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd4c7d34_55d1_4a9c_bf61_6d432ee34dff.slice/crio-dfd8c7f4e5278888940ae02472f576f6fc70593302c2db322a3b4b2b21fb21cd WatchSource:0}: Error finding container dfd8c7f4e5278888940ae02472f576f6fc70593302c2db322a3b4b2b21fb21cd: Status 404 returned error can't find the container with id dfd8c7f4e5278888940ae02472f576f6fc70593302c2db322a3b4b2b21fb21cd Apr 22 21:15:19.390075 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:19.389986 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7678c4655f-9x24b" event={"ID":"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff","Type":"ContainerStarted","Data":"ae0f4b6b5dc85dc1d0c4e1e7a6fdcd2e3cb0231f23d8a7e61dbd808d7e459889"} Apr 22 21:15:19.390075 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:19.390022 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7678c4655f-9x24b" event={"ID":"bd4c7d34-55d1-4a9c-bf61-6d432ee34dff","Type":"ContainerStarted","Data":"dfd8c7f4e5278888940ae02472f576f6fc70593302c2db322a3b4b2b21fb21cd"} Apr 22 21:15:19.407915 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:19.407855 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7678c4655f-9x24b" podStartSLOduration=1.407841814 podStartE2EDuration="1.407841814s" podCreationTimestamp="2026-04-22 21:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:15:19.404828978 +0000 UTC m=+367.277720831" 
watchObservedRunningTime="2026-04-22 21:15:19.407841814 +0000 UTC m=+367.280733667" Apr 22 21:15:20.368300 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.368267 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-llpbb"] Apr 22 21:15:20.370586 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.370567 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" Apr 22 21:15:20.372754 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.372734 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nr656\"" Apr 22 21:15:20.379702 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.379681 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-llpbb"] Apr 22 21:15:20.389002 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.388978 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9npb\" (UniqueName: \"kubernetes.io/projected/c9d6f66b-3983-465f-89c8-461a2050cbfc-kube-api-access-w9npb\") pod \"authorino-f99f4b5cd-llpbb\" (UID: \"c9d6f66b-3983-465f-89c8-461a2050cbfc\") " pod="kuadrant-system/authorino-f99f4b5cd-llpbb" Apr 22 21:15:20.490426 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.490395 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9npb\" (UniqueName: \"kubernetes.io/projected/c9d6f66b-3983-465f-89c8-461a2050cbfc-kube-api-access-w9npb\") pod \"authorino-f99f4b5cd-llpbb\" (UID: \"c9d6f66b-3983-465f-89c8-461a2050cbfc\") " pod="kuadrant-system/authorino-f99f4b5cd-llpbb" Apr 22 21:15:20.499612 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.499583 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9npb\" (UniqueName: 
\"kubernetes.io/projected/c9d6f66b-3983-465f-89c8-461a2050cbfc-kube-api-access-w9npb\") pod \"authorino-f99f4b5cd-llpbb\" (UID: \"c9d6f66b-3983-465f-89c8-461a2050cbfc\") " pod="kuadrant-system/authorino-f99f4b5cd-llpbb" Apr 22 21:15:20.599841 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.599816 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-cgsjh"] Apr 22 21:15:20.603060 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.603045 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-cgsjh" Apr 22 21:15:20.609937 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.609916 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-cgsjh"] Apr 22 21:15:20.681569 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.681505 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" Apr 22 21:15:20.692224 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.692200 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97796\" (UniqueName: \"kubernetes.io/projected/7e18c675-9be9-4fbf-b2bc-6f937a1bfe60-kube-api-access-97796\") pod \"authorino-7498df8756-cgsjh\" (UID: \"7e18c675-9be9-4fbf-b2bc-6f937a1bfe60\") " pod="kuadrant-system/authorino-7498df8756-cgsjh" Apr 22 21:15:20.793324 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.793288 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97796\" (UniqueName: \"kubernetes.io/projected/7e18c675-9be9-4fbf-b2bc-6f937a1bfe60-kube-api-access-97796\") pod \"authorino-7498df8756-cgsjh\" (UID: \"7e18c675-9be9-4fbf-b2bc-6f937a1bfe60\") " pod="kuadrant-system/authorino-7498df8756-cgsjh" Apr 22 21:15:20.795639 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.795617 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/authorino-f99f4b5cd-llpbb"] Apr 22 21:15:20.796916 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:15:20.796888 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9d6f66b_3983_465f_89c8_461a2050cbfc.slice/crio-434d89d3d514f4260beddd221376786b8930db3ba1bdad921a01a9e54f229415 WatchSource:0}: Error finding container 434d89d3d514f4260beddd221376786b8930db3ba1bdad921a01a9e54f229415: Status 404 returned error can't find the container with id 434d89d3d514f4260beddd221376786b8930db3ba1bdad921a01a9e54f229415 Apr 22 21:15:20.800730 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.800709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97796\" (UniqueName: \"kubernetes.io/projected/7e18c675-9be9-4fbf-b2bc-6f937a1bfe60-kube-api-access-97796\") pod \"authorino-7498df8756-cgsjh\" (UID: \"7e18c675-9be9-4fbf-b2bc-6f937a1bfe60\") " pod="kuadrant-system/authorino-7498df8756-cgsjh" Apr 22 21:15:20.913593 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:20.913573 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-cgsjh" Apr 22 21:15:21.024622 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:21.024594 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-cgsjh"] Apr 22 21:15:21.025606 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:15:21.025584 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e18c675_9be9_4fbf_b2bc_6f937a1bfe60.slice/crio-6938f3630f5b27b97fd98d648182a5b1e82dd682f3a91c54beed8c98800b4a61 WatchSource:0}: Error finding container 6938f3630f5b27b97fd98d648182a5b1e82dd682f3a91c54beed8c98800b4a61: Status 404 returned error can't find the container with id 6938f3630f5b27b97fd98d648182a5b1e82dd682f3a91c54beed8c98800b4a61 Apr 22 21:15:21.398716 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:21.398684 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-cgsjh" event={"ID":"7e18c675-9be9-4fbf-b2bc-6f937a1bfe60","Type":"ContainerStarted","Data":"6938f3630f5b27b97fd98d648182a5b1e82dd682f3a91c54beed8c98800b4a61"} Apr 22 21:15:21.399727 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:21.399707 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" event={"ID":"c9d6f66b-3983-465f-89c8-461a2050cbfc","Type":"ContainerStarted","Data":"434d89d3d514f4260beddd221376786b8930db3ba1bdad921a01a9e54f229415"} Apr 22 21:15:25.424063 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:25.424021 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" event={"ID":"c9d6f66b-3983-465f-89c8-461a2050cbfc","Type":"ContainerStarted","Data":"3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb"} Apr 22 21:15:25.425417 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:25.425391 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-7498df8756-cgsjh" event={"ID":"7e18c675-9be9-4fbf-b2bc-6f937a1bfe60","Type":"ContainerStarted","Data":"f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287"} Apr 22 21:15:25.439186 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:25.439124 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" podStartSLOduration=1.825580722 podStartE2EDuration="5.439111831s" podCreationTimestamp="2026-04-22 21:15:20 +0000 UTC" firstStartedPulling="2026-04-22 21:15:20.797959463 +0000 UTC m=+368.670851296" lastFinishedPulling="2026-04-22 21:15:24.411490572 +0000 UTC m=+372.284382405" observedRunningTime="2026-04-22 21:15:25.436934775 +0000 UTC m=+373.309826641" watchObservedRunningTime="2026-04-22 21:15:25.439111831 +0000 UTC m=+373.312003753" Apr 22 21:15:25.452863 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:25.452825 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-cgsjh" podStartSLOduration=2.056062788 podStartE2EDuration="5.452810038s" podCreationTimestamp="2026-04-22 21:15:20 +0000 UTC" firstStartedPulling="2026-04-22 21:15:21.026918423 +0000 UTC m=+368.899810254" lastFinishedPulling="2026-04-22 21:15:24.423665675 +0000 UTC m=+372.296557504" observedRunningTime="2026-04-22 21:15:25.450209823 +0000 UTC m=+373.323101676" watchObservedRunningTime="2026-04-22 21:15:25.452810038 +0000 UTC m=+373.325701889" Apr 22 21:15:25.473034 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:25.473009 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-llpbb"] Apr 22 21:15:27.433784 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:27.433738 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" podUID="c9d6f66b-3983-465f-89c8-461a2050cbfc" containerName="authorino" 
containerID="cri-o://3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb" gracePeriod=30 Apr 22 21:15:27.672955 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:27.672934 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" Apr 22 21:15:27.756798 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:27.756770 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9npb\" (UniqueName: \"kubernetes.io/projected/c9d6f66b-3983-465f-89c8-461a2050cbfc-kube-api-access-w9npb\") pod \"c9d6f66b-3983-465f-89c8-461a2050cbfc\" (UID: \"c9d6f66b-3983-465f-89c8-461a2050cbfc\") " Apr 22 21:15:27.758715 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:27.758692 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d6f66b-3983-465f-89c8-461a2050cbfc-kube-api-access-w9npb" (OuterVolumeSpecName: "kube-api-access-w9npb") pod "c9d6f66b-3983-465f-89c8-461a2050cbfc" (UID: "c9d6f66b-3983-465f-89c8-461a2050cbfc"). InnerVolumeSpecName "kube-api-access-w9npb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:15:27.858108 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:27.858079 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9npb\" (UniqueName: \"kubernetes.io/projected/c9d6f66b-3983-465f-89c8-461a2050cbfc-kube-api-access-w9npb\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:15:28.439613 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.439578 2569 generic.go:358] "Generic (PLEG): container finished" podID="c9d6f66b-3983-465f-89c8-461a2050cbfc" containerID="3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb" exitCode=0 Apr 22 21:15:28.440080 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.439631 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" Apr 22 21:15:28.440080 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.439665 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" event={"ID":"c9d6f66b-3983-465f-89c8-461a2050cbfc","Type":"ContainerDied","Data":"3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb"} Apr 22 21:15:28.440080 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.439707 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-llpbb" event={"ID":"c9d6f66b-3983-465f-89c8-461a2050cbfc","Type":"ContainerDied","Data":"434d89d3d514f4260beddd221376786b8930db3ba1bdad921a01a9e54f229415"} Apr 22 21:15:28.440080 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.439725 2569 scope.go:117] "RemoveContainer" containerID="3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb" Apr 22 21:15:28.449885 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.449866 2569 scope.go:117] "RemoveContainer" containerID="3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb" Apr 22 21:15:28.450224 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:15:28.450195 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb\": container with ID starting with 3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb not found: ID does not exist" containerID="3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb" Apr 22 21:15:28.450333 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.450232 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb"} err="failed to get container status \"3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb\": rpc error: code = NotFound 
desc = could not find container \"3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb\": container with ID starting with 3a7ea0ee00df23b02b22714a709ec2f294ca669e52e1221b17c8590bf6eda9cb not found: ID does not exist" Apr 22 21:15:28.462327 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.462301 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-llpbb"] Apr 22 21:15:28.469588 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.465934 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-llpbb"] Apr 22 21:15:28.488892 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.488872 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:28.488952 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.488906 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:28.493558 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.493539 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:28.789596 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:28.789558 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d6f66b-3983-465f-89c8-461a2050cbfc" path="/var/lib/kubelet/pods/c9d6f66b-3983-465f-89c8-461a2050cbfc/volumes" Apr 22 21:15:29.448222 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:29.448198 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7678c4655f-9x24b" Apr 22 21:15:53.824036 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:53.824004 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z57dl"] Apr 22 21:15:53.824510 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:53.824419 2569 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="c9d6f66b-3983-465f-89c8-461a2050cbfc" containerName="authorino" Apr 22 21:15:53.824510 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:53.824433 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d6f66b-3983-465f-89c8-461a2050cbfc" containerName="authorino" Apr 22 21:15:53.824580 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:53.824512 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9d6f66b-3983-465f-89c8-461a2050cbfc" containerName="authorino" Apr 22 21:15:53.827546 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:53.827528 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-z57dl" Apr 22 21:15:53.835656 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:53.835297 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z57dl"] Apr 22 21:15:53.973893 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:53.973861 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z98mz\" (UniqueName: \"kubernetes.io/projected/f6b727e0-c7b8-4d7e-9b5b-715e4babd03d-kube-api-access-z98mz\") pod \"authorino-8b475cf9f-z57dl\" (UID: \"f6b727e0-c7b8-4d7e-9b5b-715e4babd03d\") " pod="kuadrant-system/authorino-8b475cf9f-z57dl" Apr 22 21:15:54.053360 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.053332 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z57dl"] Apr 22 21:15:54.053592 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:15:54.053573 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-z98mz], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-z57dl" podUID="f6b727e0-c7b8-4d7e-9b5b-715e4babd03d" Apr 22 21:15:54.074677 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.074604 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z98mz\" (UniqueName: \"kubernetes.io/projected/f6b727e0-c7b8-4d7e-9b5b-715e4babd03d-kube-api-access-z98mz\") pod \"authorino-8b475cf9f-z57dl\" (UID: \"f6b727e0-c7b8-4d7e-9b5b-715e4babd03d\") " pod="kuadrant-system/authorino-8b475cf9f-z57dl" Apr 22 21:15:54.075888 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.075857 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-866d4dcc54-wfklr"] Apr 22 21:15:54.079283 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.079267 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-866d4dcc54-wfklr" Apr 22 21:15:54.081469 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.081400 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z98mz\" (UniqueName: \"kubernetes.io/projected/f6b727e0-c7b8-4d7e-9b5b-715e4babd03d-kube-api-access-z98mz\") pod \"authorino-8b475cf9f-z57dl\" (UID: \"f6b727e0-c7b8-4d7e-9b5b-715e4babd03d\") " pod="kuadrant-system/authorino-8b475cf9f-z57dl" Apr 22 21:15:54.083806 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.083784 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-866d4dcc54-wfklr"] Apr 22 21:15:54.170210 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.170186 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-866d4dcc54-wfklr"] Apr 22 21:15:54.170457 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:15:54.170437 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2kdpt], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-866d4dcc54-wfklr" podUID="07aaa917-ced5-4c95-b817-771c844bd6e3" Apr 22 21:15:54.175939 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.175918 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kdpt\" (UniqueName: \"kubernetes.io/projected/07aaa917-ced5-4c95-b817-771c844bd6e3-kube-api-access-2kdpt\") pod \"authorino-866d4dcc54-wfklr\" (UID: \"07aaa917-ced5-4c95-b817-771c844bd6e3\") " pod="kuadrant-system/authorino-866d4dcc54-wfklr" Apr 22 21:15:54.195710 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.195687 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7c5b497b5d-tqzx7"] Apr 22 21:15:54.199236 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.199218 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:15:54.201590 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.201573 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 22 21:15:54.204499 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.204480 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7c5b497b5d-tqzx7"] Apr 22 21:15:54.276736 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.276710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kdpt\" (UniqueName: \"kubernetes.io/projected/07aaa917-ced5-4c95-b817-771c844bd6e3-kube-api-access-2kdpt\") pod \"authorino-866d4dcc54-wfklr\" (UID: \"07aaa917-ced5-4c95-b817-771c844bd6e3\") " pod="kuadrant-system/authorino-866d4dcc54-wfklr" Apr 22 21:15:54.276864 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.276782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwhz7\" (UniqueName: \"kubernetes.io/projected/0e450473-dd8e-423b-8b64-36ed883e2568-kube-api-access-pwhz7\") pod \"authorino-7c5b497b5d-tqzx7\" (UID: \"0e450473-dd8e-423b-8b64-36ed883e2568\") " pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:15:54.276864 
ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.276803 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0e450473-dd8e-423b-8b64-36ed883e2568-tls-cert\") pod \"authorino-7c5b497b5d-tqzx7\" (UID: \"0e450473-dd8e-423b-8b64-36ed883e2568\") " pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:15:54.284108 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.284075 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kdpt\" (UniqueName: \"kubernetes.io/projected/07aaa917-ced5-4c95-b817-771c844bd6e3-kube-api-access-2kdpt\") pod \"authorino-866d4dcc54-wfklr\" (UID: \"07aaa917-ced5-4c95-b817-771c844bd6e3\") " pod="kuadrant-system/authorino-866d4dcc54-wfklr" Apr 22 21:15:54.378201 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.378108 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwhz7\" (UniqueName: \"kubernetes.io/projected/0e450473-dd8e-423b-8b64-36ed883e2568-kube-api-access-pwhz7\") pod \"authorino-7c5b497b5d-tqzx7\" (UID: \"0e450473-dd8e-423b-8b64-36ed883e2568\") " pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:15:54.378201 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.378167 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0e450473-dd8e-423b-8b64-36ed883e2568-tls-cert\") pod \"authorino-7c5b497b5d-tqzx7\" (UID: \"0e450473-dd8e-423b-8b64-36ed883e2568\") " pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:15:54.380641 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.380608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0e450473-dd8e-423b-8b64-36ed883e2568-tls-cert\") pod \"authorino-7c5b497b5d-tqzx7\" (UID: \"0e450473-dd8e-423b-8b64-36ed883e2568\") " 
pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:15:54.385051 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.385028 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwhz7\" (UniqueName: \"kubernetes.io/projected/0e450473-dd8e-423b-8b64-36ed883e2568-kube-api-access-pwhz7\") pod \"authorino-7c5b497b5d-tqzx7\" (UID: \"0e450473-dd8e-423b-8b64-36ed883e2568\") " pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:15:54.509005 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.508982 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:15:54.543879 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.543853 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-866d4dcc54-wfklr" Apr 22 21:15:54.543879 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.543867 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-z57dl" Apr 22 21:15:54.549563 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.549545 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-866d4dcc54-wfklr" Apr 22 21:15:54.552979 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.552963 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-z57dl" Apr 22 21:15:54.627255 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.627230 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7c5b497b5d-tqzx7"] Apr 22 21:15:54.628832 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:15:54.628773 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e450473_dd8e_423b_8b64_36ed883e2568.slice/crio-6e865a5e0b20c0520d2801f70056bd44cbf4898a359b98b763d2bef5b4ff7b53 WatchSource:0}: Error finding container 6e865a5e0b20c0520d2801f70056bd44cbf4898a359b98b763d2bef5b4ff7b53: Status 404 returned error can't find the container with id 6e865a5e0b20c0520d2801f70056bd44cbf4898a359b98b763d2bef5b4ff7b53 Apr 22 21:15:54.681049 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.681027 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kdpt\" (UniqueName: \"kubernetes.io/projected/07aaa917-ced5-4c95-b817-771c844bd6e3-kube-api-access-2kdpt\") pod \"07aaa917-ced5-4c95-b817-771c844bd6e3\" (UID: \"07aaa917-ced5-4c95-b817-771c844bd6e3\") " Apr 22 21:15:54.681165 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.681134 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z98mz\" (UniqueName: \"kubernetes.io/projected/f6b727e0-c7b8-4d7e-9b5b-715e4babd03d-kube-api-access-z98mz\") pod \"f6b727e0-c7b8-4d7e-9b5b-715e4babd03d\" (UID: \"f6b727e0-c7b8-4d7e-9b5b-715e4babd03d\") " Apr 22 21:15:54.682912 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.682879 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07aaa917-ced5-4c95-b817-771c844bd6e3-kube-api-access-2kdpt" (OuterVolumeSpecName: "kube-api-access-2kdpt") pod "07aaa917-ced5-4c95-b817-771c844bd6e3" (UID: "07aaa917-ced5-4c95-b817-771c844bd6e3"). 
InnerVolumeSpecName "kube-api-access-2kdpt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:15:54.682994 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.682954 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b727e0-c7b8-4d7e-9b5b-715e4babd03d-kube-api-access-z98mz" (OuterVolumeSpecName: "kube-api-access-z98mz") pod "f6b727e0-c7b8-4d7e-9b5b-715e4babd03d" (UID: "f6b727e0-c7b8-4d7e-9b5b-715e4babd03d"). InnerVolumeSpecName "kube-api-access-z98mz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:15:54.782198 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.782178 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z98mz\" (UniqueName: \"kubernetes.io/projected/f6b727e0-c7b8-4d7e-9b5b-715e4babd03d-kube-api-access-z98mz\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:15:54.782198 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:54.782199 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kdpt\" (UniqueName: \"kubernetes.io/projected/07aaa917-ced5-4c95-b817-771c844bd6e3-kube-api-access-2kdpt\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:15:55.549337 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.549297 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" event={"ID":"0e450473-dd8e-423b-8b64-36ed883e2568","Type":"ContainerStarted","Data":"547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3"} Apr 22 21:15:55.549337 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.549334 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-z57dl" Apr 22 21:15:55.549788 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.549344 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" event={"ID":"0e450473-dd8e-423b-8b64-36ed883e2568","Type":"ContainerStarted","Data":"6e865a5e0b20c0520d2801f70056bd44cbf4898a359b98b763d2bef5b4ff7b53"} Apr 22 21:15:55.549788 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.549341 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-866d4dcc54-wfklr" Apr 22 21:15:55.563398 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.563350 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" podStartSLOduration=1.009025902 podStartE2EDuration="1.56333739s" podCreationTimestamp="2026-04-22 21:15:54 +0000 UTC" firstStartedPulling="2026-04-22 21:15:54.63005936 +0000 UTC m=+402.502951189" lastFinishedPulling="2026-04-22 21:15:55.184370831 +0000 UTC m=+403.057262677" observedRunningTime="2026-04-22 21:15:55.563022033 +0000 UTC m=+403.435913889" watchObservedRunningTime="2026-04-22 21:15:55.56333739 +0000 UTC m=+403.436229241" Apr 22 21:15:55.588184 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.588135 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-cgsjh"] Apr 22 21:15:55.588384 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.588363 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-cgsjh" podUID="7e18c675-9be9-4fbf-b2bc-6f937a1bfe60" containerName="authorino" containerID="cri-o://f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287" gracePeriod=30 Apr 22 21:15:55.594571 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.594548 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/authorino-8b475cf9f-z57dl"] Apr 22 21:15:55.598367 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.598342 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-z57dl"] Apr 22 21:15:55.616013 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.615989 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-866d4dcc54-wfklr"] Apr 22 21:15:55.618032 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.618009 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-866d4dcc54-wfklr"] Apr 22 21:15:55.804126 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.804068 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-cgsjh" Apr 22 21:15:55.890370 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.890340 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97796\" (UniqueName: \"kubernetes.io/projected/7e18c675-9be9-4fbf-b2bc-6f937a1bfe60-kube-api-access-97796\") pod \"7e18c675-9be9-4fbf-b2bc-6f937a1bfe60\" (UID: \"7e18c675-9be9-4fbf-b2bc-6f937a1bfe60\") " Apr 22 21:15:55.892273 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.892247 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e18c675-9be9-4fbf-b2bc-6f937a1bfe60-kube-api-access-97796" (OuterVolumeSpecName: "kube-api-access-97796") pod "7e18c675-9be9-4fbf-b2bc-6f937a1bfe60" (UID: "7e18c675-9be9-4fbf-b2bc-6f937a1bfe60"). InnerVolumeSpecName "kube-api-access-97796". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:15:55.991449 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:55.991427 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-97796\" (UniqueName: \"kubernetes.io/projected/7e18c675-9be9-4fbf-b2bc-6f937a1bfe60-kube-api-access-97796\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:15:56.558695 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.558663 2569 generic.go:358] "Generic (PLEG): container finished" podID="7e18c675-9be9-4fbf-b2bc-6f937a1bfe60" containerID="f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287" exitCode=0 Apr 22 21:15:56.559109 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.558716 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-cgsjh" Apr 22 21:15:56.559109 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.558745 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-cgsjh" event={"ID":"7e18c675-9be9-4fbf-b2bc-6f937a1bfe60","Type":"ContainerDied","Data":"f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287"} Apr 22 21:15:56.559109 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.558783 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-cgsjh" event={"ID":"7e18c675-9be9-4fbf-b2bc-6f937a1bfe60","Type":"ContainerDied","Data":"6938f3630f5b27b97fd98d648182a5b1e82dd682f3a91c54beed8c98800b4a61"} Apr 22 21:15:56.559109 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.558800 2569 scope.go:117] "RemoveContainer" containerID="f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287" Apr 22 21:15:56.567978 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.567961 2569 scope.go:117] "RemoveContainer" containerID="f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287" Apr 22 21:15:56.568258 ip-10-0-133-75 kubenswrapper[2569]: E0422 
21:15:56.568242 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287\": container with ID starting with f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287 not found: ID does not exist" containerID="f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287" Apr 22 21:15:56.568318 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.568265 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287"} err="failed to get container status \"f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287\": rpc error: code = NotFound desc = could not find container \"f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287\": container with ID starting with f5aa1aae78e1ef8e97924f01181deef63f52f9030f0f264d466e2fdadddf8287 not found: ID does not exist" Apr 22 21:15:56.579714 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.579693 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-cgsjh"] Apr 22 21:15:56.583370 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.583344 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-cgsjh"] Apr 22 21:15:56.789906 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.789865 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07aaa917-ced5-4c95-b817-771c844bd6e3" path="/var/lib/kubelet/pods/07aaa917-ced5-4c95-b817-771c844bd6e3/volumes" Apr 22 21:15:56.790198 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:15:56.790181 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e18c675-9be9-4fbf-b2bc-6f937a1bfe60" path="/var/lib/kubelet/pods/7e18c675-9be9-4fbf-b2bc-6f937a1bfe60/volumes" Apr 22 21:15:56.790534 ip-10-0-133-75 kubenswrapper[2569]: 
I0422 21:15:56.790516 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b727e0-c7b8-4d7e-9b5b-715e4babd03d" path="/var/lib/kubelet/pods/f6b727e0-c7b8-4d7e-9b5b-715e4babd03d/volumes" Apr 22 21:18:17.177336 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.177297 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-759b6db555-c2xlg"] Apr 22 21:18:17.177882 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.177864 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e18c675-9be9-4fbf-b2bc-6f937a1bfe60" containerName="authorino" Apr 22 21:18:17.177959 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.177886 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e18c675-9be9-4fbf-b2bc-6f937a1bfe60" containerName="authorino" Apr 22 21:18:17.178010 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.177967 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e18c675-9be9-4fbf-b2bc-6f937a1bfe60" containerName="authorino" Apr 22 21:18:17.181111 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.181081 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-759b6db555-c2xlg" Apr 22 21:18:17.186882 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.186857 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-759b6db555-c2xlg"] Apr 22 21:18:17.245518 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.245489 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w76l8\" (UniqueName: \"kubernetes.io/projected/de885737-ad17-40b9-8e06-7bc0f9b24253-kube-api-access-w76l8\") pod \"authorino-759b6db555-c2xlg\" (UID: \"de885737-ad17-40b9-8e06-7bc0f9b24253\") " pod="kuadrant-system/authorino-759b6db555-c2xlg" Apr 22 21:18:17.245643 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.245551 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/de885737-ad17-40b9-8e06-7bc0f9b24253-tls-cert\") pod \"authorino-759b6db555-c2xlg\" (UID: \"de885737-ad17-40b9-8e06-7bc0f9b24253\") " pod="kuadrant-system/authorino-759b6db555-c2xlg" Apr 22 21:18:17.346886 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.346853 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/de885737-ad17-40b9-8e06-7bc0f9b24253-tls-cert\") pod \"authorino-759b6db555-c2xlg\" (UID: \"de885737-ad17-40b9-8e06-7bc0f9b24253\") " pod="kuadrant-system/authorino-759b6db555-c2xlg" Apr 22 21:18:17.347045 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.346987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w76l8\" (UniqueName: \"kubernetes.io/projected/de885737-ad17-40b9-8e06-7bc0f9b24253-kube-api-access-w76l8\") pod \"authorino-759b6db555-c2xlg\" (UID: \"de885737-ad17-40b9-8e06-7bc0f9b24253\") " pod="kuadrant-system/authorino-759b6db555-c2xlg" Apr 22 21:18:17.349326 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:18:17.349303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/de885737-ad17-40b9-8e06-7bc0f9b24253-tls-cert\") pod \"authorino-759b6db555-c2xlg\" (UID: \"de885737-ad17-40b9-8e06-7bc0f9b24253\") " pod="kuadrant-system/authorino-759b6db555-c2xlg" Apr 22 21:18:17.354232 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.354207 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w76l8\" (UniqueName: \"kubernetes.io/projected/de885737-ad17-40b9-8e06-7bc0f9b24253-kube-api-access-w76l8\") pod \"authorino-759b6db555-c2xlg\" (UID: \"de885737-ad17-40b9-8e06-7bc0f9b24253\") " pod="kuadrant-system/authorino-759b6db555-c2xlg" Apr 22 21:18:17.491500 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.491420 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-759b6db555-c2xlg" Apr 22 21:18:17.610112 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:17.610087 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-759b6db555-c2xlg"] Apr 22 21:18:17.611567 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:18:17.611537 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde885737_ad17_40b9_8e06_7bc0f9b24253.slice/crio-8b62ebe7a9b4a793fc394a514c4d742f417d5b9ce6fd0617107469034755a434 WatchSource:0}: Error finding container 8b62ebe7a9b4a793fc394a514c4d742f417d5b9ce6fd0617107469034755a434: Status 404 returned error can't find the container with id 8b62ebe7a9b4a793fc394a514c4d742f417d5b9ce6fd0617107469034755a434 Apr 22 21:18:18.092409 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:18.092377 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-759b6db555-c2xlg" 
event={"ID":"de885737-ad17-40b9-8e06-7bc0f9b24253","Type":"ContainerStarted","Data":"8b62ebe7a9b4a793fc394a514c4d742f417d5b9ce6fd0617107469034755a434"} Apr 22 21:18:19.096896 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.096860 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-759b6db555-c2xlg" event={"ID":"de885737-ad17-40b9-8e06-7bc0f9b24253","Type":"ContainerStarted","Data":"aff9a86452bd93b134a8c100fb33bda89f55f9533c990b2de6304d9a759689d0"} Apr 22 21:18:19.112625 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.112582 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-759b6db555-c2xlg" podStartSLOduration=1.573596928 podStartE2EDuration="2.112568986s" podCreationTimestamp="2026-04-22 21:18:17 +0000 UTC" firstStartedPulling="2026-04-22 21:18:17.612832147 +0000 UTC m=+545.485723978" lastFinishedPulling="2026-04-22 21:18:18.151804206 +0000 UTC m=+546.024696036" observedRunningTime="2026-04-22 21:18:19.109670681 +0000 UTC m=+546.982562534" watchObservedRunningTime="2026-04-22 21:18:19.112568986 +0000 UTC m=+546.985460838" Apr 22 21:18:19.133402 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.133375 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7c5b497b5d-tqzx7"] Apr 22 21:18:19.133593 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.133573 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" podUID="0e450473-dd8e-423b-8b64-36ed883e2568" containerName="authorino" containerID="cri-o://547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3" gracePeriod=30 Apr 22 21:18:19.366369 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.366348 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:18:19.562503 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.562470 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwhz7\" (UniqueName: \"kubernetes.io/projected/0e450473-dd8e-423b-8b64-36ed883e2568-kube-api-access-pwhz7\") pod \"0e450473-dd8e-423b-8b64-36ed883e2568\" (UID: \"0e450473-dd8e-423b-8b64-36ed883e2568\") " Apr 22 21:18:19.562503 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.562504 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0e450473-dd8e-423b-8b64-36ed883e2568-tls-cert\") pod \"0e450473-dd8e-423b-8b64-36ed883e2568\" (UID: \"0e450473-dd8e-423b-8b64-36ed883e2568\") " Apr 22 21:18:19.564637 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.564599 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e450473-dd8e-423b-8b64-36ed883e2568-kube-api-access-pwhz7" (OuterVolumeSpecName: "kube-api-access-pwhz7") pod "0e450473-dd8e-423b-8b64-36ed883e2568" (UID: "0e450473-dd8e-423b-8b64-36ed883e2568"). InnerVolumeSpecName "kube-api-access-pwhz7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:18:19.572248 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.572221 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e450473-dd8e-423b-8b64-36ed883e2568-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "0e450473-dd8e-423b-8b64-36ed883e2568" (UID: "0e450473-dd8e-423b-8b64-36ed883e2568"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:18:19.663500 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.663443 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pwhz7\" (UniqueName: \"kubernetes.io/projected/0e450473-dd8e-423b-8b64-36ed883e2568-kube-api-access-pwhz7\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:18:19.663500 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:19.663465 2569 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/0e450473-dd8e-423b-8b64-36ed883e2568-tls-cert\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:18:20.101110 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.101076 2569 generic.go:358] "Generic (PLEG): container finished" podID="0e450473-dd8e-423b-8b64-36ed883e2568" containerID="547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3" exitCode=0 Apr 22 21:18:20.101564 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.101117 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" Apr 22 21:18:20.101564 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.101163 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" event={"ID":"0e450473-dd8e-423b-8b64-36ed883e2568","Type":"ContainerDied","Data":"547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3"} Apr 22 21:18:20.101564 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.101199 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7c5b497b5d-tqzx7" event={"ID":"0e450473-dd8e-423b-8b64-36ed883e2568","Type":"ContainerDied","Data":"6e865a5e0b20c0520d2801f70056bd44cbf4898a359b98b763d2bef5b4ff7b53"} Apr 22 21:18:20.101564 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.101214 2569 scope.go:117] "RemoveContainer" containerID="547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3" Apr 22 21:18:20.110115 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.110101 2569 scope.go:117] "RemoveContainer" containerID="547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3" Apr 22 21:18:20.110364 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:18:20.110345 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3\": container with ID starting with 547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3 not found: ID does not exist" containerID="547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3" Apr 22 21:18:20.110418 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.110373 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3"} err="failed to get container status \"547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3\": rpc error: code = 
NotFound desc = could not find container \"547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3\": container with ID starting with 547a9d65ce3335b291625938a0d3ad50d1a3cd6f2e42418c2cec711df6bd2cb3 not found: ID does not exist" Apr 22 21:18:20.122069 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.122045 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7c5b497b5d-tqzx7"] Apr 22 21:18:20.125736 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.125715 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7c5b497b5d-tqzx7"] Apr 22 21:18:20.794875 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:18:20.794843 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e450473-dd8e-423b-8b64-36ed883e2568" path="/var/lib/kubelet/pods/0e450473-dd8e-423b-8b64-36ed883e2568/volumes" Apr 22 21:19:12.702528 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:19:12.702498 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:19:12.703070 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:19:12.702936 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:24:12.735131 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:24:12.735059 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:24:12.736969 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:24:12.736933 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:29:12.773049 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:29:12.773019 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:29:12.776709 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:29:12.776687 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:30:00.141307 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.141270 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29614890-c92g2"] Apr 22 21:30:00.141773 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.141622 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e450473-dd8e-423b-8b64-36ed883e2568" containerName="authorino" Apr 22 21:30:00.141773 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.141632 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e450473-dd8e-423b-8b64-36ed883e2568" containerName="authorino" Apr 22 21:30:00.141773 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.141703 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e450473-dd8e-423b-8b64-36ed883e2568" containerName="authorino" Apr 22 21:30:00.144869 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.144853 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" Apr 22 21:30:00.146903 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.146880 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qxg7n\"" Apr 22 21:30:00.150702 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.150678 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614890-c92g2"] Apr 22 21:30:00.240670 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.240608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc7mw\" (UniqueName: \"kubernetes.io/projected/cbc389d9-8350-44fa-95e0-736d1fa3a8ca-kube-api-access-lc7mw\") pod \"maas-api-key-cleanup-29614890-c92g2\" (UID: \"cbc389d9-8350-44fa-95e0-736d1fa3a8ca\") " pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" Apr 22 21:30:00.341225 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.341198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc7mw\" (UniqueName: \"kubernetes.io/projected/cbc389d9-8350-44fa-95e0-736d1fa3a8ca-kube-api-access-lc7mw\") pod \"maas-api-key-cleanup-29614890-c92g2\" (UID: \"cbc389d9-8350-44fa-95e0-736d1fa3a8ca\") " pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" Apr 22 21:30:00.349216 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.349192 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc7mw\" (UniqueName: \"kubernetes.io/projected/cbc389d9-8350-44fa-95e0-736d1fa3a8ca-kube-api-access-lc7mw\") pod \"maas-api-key-cleanup-29614890-c92g2\" (UID: \"cbc389d9-8350-44fa-95e0-736d1fa3a8ca\") " pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" Apr 22 21:30:00.455907 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.455857 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" Apr 22 21:30:00.778940 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.778909 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614890-c92g2"] Apr 22 21:30:00.780244 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:30:00.780206 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc389d9_8350_44fa_95e0_736d1fa3a8ca.slice/crio-0dac1d5acaf1645213769443d6f48483a9c9d18db1f92af956474cb0a0d4d51f WatchSource:0}: Error finding container 0dac1d5acaf1645213769443d6f48483a9c9d18db1f92af956474cb0a0d4d51f: Status 404 returned error can't find the container with id 0dac1d5acaf1645213769443d6f48483a9c9d18db1f92af956474cb0a0d4d51f Apr 22 21:30:00.781919 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:00.781901 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:30:01.748912 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:01.748868 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" event={"ID":"cbc389d9-8350-44fa-95e0-736d1fa3a8ca","Type":"ContainerStarted","Data":"0dac1d5acaf1645213769443d6f48483a9c9d18db1f92af956474cb0a0d4d51f"} Apr 22 21:30:03.759785 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:03.759752 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" event={"ID":"cbc389d9-8350-44fa-95e0-736d1fa3a8ca","Type":"ContainerStarted","Data":"81e35a7f3c98faabfaed364695bd0596ec7757edf128f089b7dc8b6054c7830e"} Apr 22 21:30:03.774080 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:03.774030 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" podStartSLOduration=1.42269695 podStartE2EDuration="3.774015743s" podCreationTimestamp="2026-04-22 
21:30:00 +0000 UTC" firstStartedPulling="2026-04-22 21:30:00.7820402 +0000 UTC m=+1248.654932030" lastFinishedPulling="2026-04-22 21:30:03.133358992 +0000 UTC m=+1251.006250823" observedRunningTime="2026-04-22 21:30:03.772198003 +0000 UTC m=+1251.645089855" watchObservedRunningTime="2026-04-22 21:30:03.774015743 +0000 UTC m=+1251.646907595" Apr 22 21:30:23.839237 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:23.839210 2569 generic.go:358] "Generic (PLEG): container finished" podID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerID="81e35a7f3c98faabfaed364695bd0596ec7757edf128f089b7dc8b6054c7830e" exitCode=6 Apr 22 21:30:23.839622 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:23.839284 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" event={"ID":"cbc389d9-8350-44fa-95e0-736d1fa3a8ca","Type":"ContainerDied","Data":"81e35a7f3c98faabfaed364695bd0596ec7757edf128f089b7dc8b6054c7830e"} Apr 22 21:30:23.839622 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:23.839586 2569 scope.go:117] "RemoveContainer" containerID="81e35a7f3c98faabfaed364695bd0596ec7757edf128f089b7dc8b6054c7830e" Apr 22 21:30:24.845111 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:24.845078 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" event={"ID":"cbc389d9-8350-44fa-95e0-736d1fa3a8ca","Type":"ContainerStarted","Data":"b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a"} Apr 22 21:30:44.925955 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:44.925916 2569 generic.go:358] "Generic (PLEG): container finished" podID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerID="b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a" exitCode=6 Apr 22 21:30:44.926461 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:44.925994 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" 
event={"ID":"cbc389d9-8350-44fa-95e0-736d1fa3a8ca","Type":"ContainerDied","Data":"b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a"} Apr 22 21:30:44.926461 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:44.926053 2569 scope.go:117] "RemoveContainer" containerID="81e35a7f3c98faabfaed364695bd0596ec7757edf128f089b7dc8b6054c7830e" Apr 22 21:30:44.926461 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:44.926428 2569 scope.go:117] "RemoveContainer" containerID="b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a" Apr 22 21:30:44.926654 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:30:44.926636 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29614890-c92g2_opendatahub(cbc389d9-8350-44fa-95e0-736d1fa3a8ca)\"" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" Apr 22 21:30:59.785119 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:30:59.785089 2569 scope.go:117] "RemoveContainer" containerID="b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a" Apr 22 21:31:00.010331 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:00.010296 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614890-c92g2"] Apr 22 21:31:00.997385 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:00.997351 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" event={"ID":"cbc389d9-8350-44fa-95e0-736d1fa3a8ca","Type":"ContainerStarted","Data":"50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852"} Apr 22 21:31:00.997767 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:00.997419 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" 
containerName="cleanup" containerID="cri-o://50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852" gracePeriod=30 Apr 22 21:31:20.534621 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:20.534597 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" Apr 22 21:31:20.607025 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:20.606957 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc7mw\" (UniqueName: \"kubernetes.io/projected/cbc389d9-8350-44fa-95e0-736d1fa3a8ca-kube-api-access-lc7mw\") pod \"cbc389d9-8350-44fa-95e0-736d1fa3a8ca\" (UID: \"cbc389d9-8350-44fa-95e0-736d1fa3a8ca\") " Apr 22 21:31:20.608956 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:20.608939 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbc389d9-8350-44fa-95e0-736d1fa3a8ca-kube-api-access-lc7mw" (OuterVolumeSpecName: "kube-api-access-lc7mw") pod "cbc389d9-8350-44fa-95e0-736d1fa3a8ca" (UID: "cbc389d9-8350-44fa-95e0-736d1fa3a8ca"). InnerVolumeSpecName "kube-api-access-lc7mw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:31:20.707643 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:20.707620 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lc7mw\" (UniqueName: \"kubernetes.io/projected/cbc389d9-8350-44fa-95e0-736d1fa3a8ca-kube-api-access-lc7mw\") on node \"ip-10-0-133-75.ec2.internal\" DevicePath \"\"" Apr 22 21:31:21.076708 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.076679 2569 generic.go:358] "Generic (PLEG): container finished" podID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerID="50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852" exitCode=6 Apr 22 21:31:21.076969 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.076765 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" Apr 22 21:31:21.076969 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.076760 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" event={"ID":"cbc389d9-8350-44fa-95e0-736d1fa3a8ca","Type":"ContainerDied","Data":"50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852"} Apr 22 21:31:21.076969 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.076865 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614890-c92g2" event={"ID":"cbc389d9-8350-44fa-95e0-736d1fa3a8ca","Type":"ContainerDied","Data":"0dac1d5acaf1645213769443d6f48483a9c9d18db1f92af956474cb0a0d4d51f"} Apr 22 21:31:21.076969 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.076887 2569 scope.go:117] "RemoveContainer" containerID="50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852" Apr 22 21:31:21.085233 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.085218 2569 scope.go:117] "RemoveContainer" containerID="b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a" Apr 22 21:31:21.092458 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.092437 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614890-c92g2"] Apr 22 21:31:21.093412 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.093396 2569 scope.go:117] "RemoveContainer" containerID="50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852" Apr 22 21:31:21.093660 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:31:21.093636 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852\": container with ID starting with 50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852 not found: ID does not exist" 
containerID="50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852" Apr 22 21:31:21.093705 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.093671 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852"} err="failed to get container status \"50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852\": rpc error: code = NotFound desc = could not find container \"50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852\": container with ID starting with 50aa32ddb5bef6a8398f9ff039556741e5896562d37f1844d75987390b741852 not found: ID does not exist" Apr 22 21:31:21.093705 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.093694 2569 scope.go:117] "RemoveContainer" containerID="b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a" Apr 22 21:31:21.093926 ip-10-0-133-75 kubenswrapper[2569]: E0422 21:31:21.093910 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a\": container with ID starting with b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a not found: ID does not exist" containerID="b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a" Apr 22 21:31:21.093964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.093932 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a"} err="failed to get container status \"b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a\": rpc error: code = NotFound desc = could not find container \"b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a\": container with ID starting with b07c371224043cbbe6bed08d9b24dadc366a1ae46862112ac974a516f7dd0f2a not found: ID does not exist" Apr 22 
21:31:21.094740 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:21.094722 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614890-c92g2"] Apr 22 21:31:22.789852 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:31:22.789818 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" path="/var/lib/kubelet/pods/cbc389d9-8350-44fa-95e0-736d1fa3a8ca/volumes" Apr 22 21:34:12.817480 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:34:12.817390 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:34:12.820347 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:34:12.820326 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:39:12.849982 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:39:12.849880 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:39:12.854181 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:39:12.854139 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log" Apr 22 21:40:52.119006 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:52.118978 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-759b6db555-c2xlg_de885737-ad17-40b9-8e06-7bc0f9b24253/authorino/0.log" Apr 22 21:40:56.828766 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:56.828732 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-754bfc4657-28svj_c42afd88-4c86-464c-b4d1-b5705f790287/manager/0.log" Apr 22 21:40:57.910783 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:40:57.910747 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh_382a169a-813d-4ae9-b423-42732d0bf9cf/pull/0.log" Apr 22 21:40:57.917702 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:57.917679 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh_382a169a-813d-4ae9-b423-42732d0bf9cf/extract/0.log" Apr 22 21:40:57.923932 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:57.923914 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh_382a169a-813d-4ae9-b423-42732d0bf9cf/util/0.log" Apr 22 21:40:58.029412 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.029389 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g_f9f19330-b51c-4326-b771-8a44d93476d1/util/0.log" Apr 22 21:40:58.034793 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.034766 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g_f9f19330-b51c-4326-b771-8a44d93476d1/pull/0.log" Apr 22 21:40:58.040185 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.040165 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g_f9f19330-b51c-4326-b771-8a44d93476d1/extract/0.log" Apr 22 21:40:58.146498 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.146479 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq_d3569a2f-8690-48b6-8973-ea29801279cc/util/0.log" Apr 22 21:40:58.152706 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.152688 2569 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq_d3569a2f-8690-48b6-8973-ea29801279cc/pull/0.log" Apr 22 21:40:58.158732 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.158705 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq_d3569a2f-8690-48b6-8973-ea29801279cc/extract/0.log" Apr 22 21:40:58.265332 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.265305 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj_ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c/util/0.log" Apr 22 21:40:58.271371 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.271353 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj_ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c/pull/0.log" Apr 22 21:40:58.276872 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.276855 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj_ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c/extract/0.log" Apr 22 21:40:58.392781 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.392758 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-759b6db555-c2xlg_de885737-ad17-40b9-8e06-7bc0f9b24253/authorino/0.log" Apr 22 21:40:58.617665 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:58.617588 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-vz65t_610ed830-b74b-4aca-b02f-3be039bea560/manager/0.log" Apr 22 21:40:59.201261 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:59.201231 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-rqnzq_d4e2addf-7473-44b3-b616-4fa938c3c366/manager/0.log" Apr 22 21:40:59.873093 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:40:59.873061 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-567cb9698d-xq5ml_8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a/kube-auth-proxy/0.log" Apr 22 21:41:05.105510 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.105477 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnb7p/must-gather-v5ml6"] Apr 22 21:41:05.105955 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.105940 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerName="cleanup" Apr 22 21:41:05.106001 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.105958 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerName="cleanup" Apr 22 21:41:05.106001 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.105974 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerName="cleanup" Apr 22 21:41:05.106001 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.105982 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerName="cleanup" Apr 22 21:41:05.106097 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.106051 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerName="cleanup" Apr 22 21:41:05.106097 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.106059 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerName="cleanup" Apr 22 21:41:05.106189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.106120 2569 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerName="cleanup" Apr 22 21:41:05.106189 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.106130 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerName="cleanup" Apr 22 21:41:05.106281 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.106268 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="cbc389d9-8350-44fa-95e0-736d1fa3a8ca" containerName="cleanup" Apr 22 21:41:05.108168 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.108136 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnb7p/must-gather-v5ml6" Apr 22 21:41:05.110894 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.110870 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nnb7p\"/\"kube-root-ca.crt\"" Apr 22 21:41:05.111712 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.111688 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nnb7p\"/\"openshift-service-ca.crt\"" Apr 22 21:41:05.111843 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.111762 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nnb7p\"/\"default-dockercfg-2cdrj\"" Apr 22 21:41:05.125994 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.125968 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnb7p/must-gather-v5ml6"] Apr 22 21:41:05.233095 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.233063 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85z52\" (UniqueName: \"kubernetes.io/projected/423cb968-1791-4d0b-b982-7e7c3f3f5cc2-kube-api-access-85z52\") pod \"must-gather-v5ml6\" (UID: \"423cb968-1791-4d0b-b982-7e7c3f3f5cc2\") " 
pod="openshift-must-gather-nnb7p/must-gather-v5ml6" Apr 22 21:41:05.233285 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.233113 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/423cb968-1791-4d0b-b982-7e7c3f3f5cc2-must-gather-output\") pod \"must-gather-v5ml6\" (UID: \"423cb968-1791-4d0b-b982-7e7c3f3f5cc2\") " pod="openshift-must-gather-nnb7p/must-gather-v5ml6" Apr 22 21:41:05.333816 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.333787 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85z52\" (UniqueName: \"kubernetes.io/projected/423cb968-1791-4d0b-b982-7e7c3f3f5cc2-kube-api-access-85z52\") pod \"must-gather-v5ml6\" (UID: \"423cb968-1791-4d0b-b982-7e7c3f3f5cc2\") " pod="openshift-must-gather-nnb7p/must-gather-v5ml6" Apr 22 21:41:05.334015 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.333828 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/423cb968-1791-4d0b-b982-7e7c3f3f5cc2-must-gather-output\") pod \"must-gather-v5ml6\" (UID: \"423cb968-1791-4d0b-b982-7e7c3f3f5cc2\") " pod="openshift-must-gather-nnb7p/must-gather-v5ml6" Apr 22 21:41:05.334125 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.334108 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/423cb968-1791-4d0b-b982-7e7c3f3f5cc2-must-gather-output\") pod \"must-gather-v5ml6\" (UID: \"423cb968-1791-4d0b-b982-7e7c3f3f5cc2\") " pod="openshift-must-gather-nnb7p/must-gather-v5ml6" Apr 22 21:41:05.344864 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.344844 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85z52\" (UniqueName: \"kubernetes.io/projected/423cb968-1791-4d0b-b982-7e7c3f3f5cc2-kube-api-access-85z52\") pod 
\"must-gather-v5ml6\" (UID: \"423cb968-1791-4d0b-b982-7e7c3f3f5cc2\") " pod="openshift-must-gather-nnb7p/must-gather-v5ml6" Apr 22 21:41:05.417928 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.417871 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnb7p/must-gather-v5ml6" Apr 22 21:41:05.531053 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.530978 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnb7p/must-gather-v5ml6"] Apr 22 21:41:05.533746 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:41:05.533721 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod423cb968_1791_4d0b_b982_7e7c3f3f5cc2.slice/crio-528fda966391ac60429355e7c3c0dd6851a48e8be7dff03ace0aff79e61bb898 WatchSource:0}: Error finding container 528fda966391ac60429355e7c3c0dd6851a48e8be7dff03ace0aff79e61bb898: Status 404 returned error can't find the container with id 528fda966391ac60429355e7c3c0dd6851a48e8be7dff03ace0aff79e61bb898 Apr 22 21:41:05.535737 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:05.535718 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:41:06.292998 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:06.292969 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnb7p/must-gather-v5ml6" event={"ID":"423cb968-1791-4d0b-b982-7e7c3f3f5cc2","Type":"ContainerStarted","Data":"528fda966391ac60429355e7c3c0dd6851a48e8be7dff03ace0aff79e61bb898"} Apr 22 21:41:07.298495 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:07.298455 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnb7p/must-gather-v5ml6" event={"ID":"423cb968-1791-4d0b-b982-7e7c3f3f5cc2","Type":"ContainerStarted","Data":"e099b8f28a99bf045cde55537b9f684f576aaa2481563b2940c7bdabf7742b98"} Apr 22 21:41:07.298495 ip-10-0-133-75 
kubenswrapper[2569]: I0422 21:41:07.298503 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnb7p/must-gather-v5ml6" event={"ID":"423cb968-1791-4d0b-b982-7e7c3f3f5cc2","Type":"ContainerStarted","Data":"4a4a1c6d56dd707f07f8c34107cb84235646e0b169c24bc2ffb8f541d26e1274"} Apr 22 21:41:07.313530 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:07.313482 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnb7p/must-gather-v5ml6" podStartSLOduration=1.4188493580000001 podStartE2EDuration="2.313466476s" podCreationTimestamp="2026-04-22 21:41:05 +0000 UTC" firstStartedPulling="2026-04-22 21:41:05.535910597 +0000 UTC m=+1913.408802432" lastFinishedPulling="2026-04-22 21:41:06.430527713 +0000 UTC m=+1914.303419550" observedRunningTime="2026-04-22 21:41:07.31218867 +0000 UTC m=+1915.185080523" watchObservedRunningTime="2026-04-22 21:41:07.313466476 +0000 UTC m=+1915.186358361" Apr 22 21:41:08.031721 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:08.031684 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-szpzh_0be939b4-a64f-442a-957d-d341f369c11b/global-pull-secret-syncer/0.log" Apr 22 21:41:08.107474 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:08.107440 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-b4l88_0031b709-1393-4369-bbf3-cc631f87aafc/konnectivity-agent/0.log" Apr 22 21:41:08.253699 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:08.253666 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-75.ec2.internal_5401552a10b9bd31fa1f4a18dcace9bb/haproxy/0.log" Apr 22 21:41:11.820166 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:11.819343 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh_382a169a-813d-4ae9-b423-42732d0bf9cf/extract/0.log" Apr 
22 21:41:11.842999 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:11.842974 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh_382a169a-813d-4ae9-b423-42732d0bf9cf/util/0.log" Apr 22 21:41:11.876248 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:11.876203 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592l4zh_382a169a-813d-4ae9-b423-42732d0bf9cf/pull/0.log" Apr 22 21:41:11.909995 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:11.909959 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g_f9f19330-b51c-4326-b771-8a44d93476d1/extract/0.log" Apr 22 21:41:11.939275 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:11.939211 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g_f9f19330-b51c-4326-b771-8a44d93476d1/util/0.log" Apr 22 21:41:11.966514 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:11.966480 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e07xg6g_f9f19330-b51c-4326-b771-8a44d93476d1/pull/0.log" Apr 22 21:41:12.012811 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:12.012784 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq_d3569a2f-8690-48b6-8973-ea29801279cc/extract/0.log" Apr 22 21:41:12.035163 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:12.034575 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq_d3569a2f-8690-48b6-8973-ea29801279cc/util/0.log" Apr 22 21:41:12.063024 ip-10-0-133-75 kubenswrapper[2569]: I0422 
21:41:12.062994 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73sg2lq_d3569a2f-8690-48b6-8973-ea29801279cc/pull/0.log" Apr 22 21:41:12.094116 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:12.094023 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj_ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c/extract/0.log" Apr 22 21:41:12.117198 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:12.117163 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj_ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c/util/0.log" Apr 22 21:41:12.135903 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:12.135873 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1tcnfj_ce7edcdd-59ce-46aa-ac0f-da1c0bb6b70c/pull/0.log" Apr 22 21:41:12.450216 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:12.447172 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-759b6db555-c2xlg_de885737-ad17-40b9-8e06-7bc0f9b24253/authorino/0.log" Apr 22 21:41:12.507171 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:12.507004 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-vz65t_610ed830-b74b-4aca-b02f-3be039bea560/manager/0.log" Apr 22 21:41:12.804076 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:12.802972 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-rqnzq_d4e2addf-7473-44b3-b616-4fa938c3c366/manager/0.log" Apr 22 21:41:14.280452 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.280420 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3ceb214d-3079-4c0d-a30f-961fe468d29b/alertmanager/0.log" Apr 22 21:41:14.306175 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.306128 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3ceb214d-3079-4c0d-a30f-961fe468d29b/config-reloader/0.log" Apr 22 21:41:14.326962 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.326922 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3ceb214d-3079-4c0d-a30f-961fe468d29b/kube-rbac-proxy-web/0.log" Apr 22 21:41:14.351823 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.351791 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3ceb214d-3079-4c0d-a30f-961fe468d29b/kube-rbac-proxy/0.log" Apr 22 21:41:14.371495 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.371455 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3ceb214d-3079-4c0d-a30f-961fe468d29b/kube-rbac-proxy-metric/0.log" Apr 22 21:41:14.391357 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.391283 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3ceb214d-3079-4c0d-a30f-961fe468d29b/prom-label-proxy/0.log" Apr 22 21:41:14.415008 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.414968 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_3ceb214d-3079-4c0d-a30f-961fe468d29b/init-config-reloader/0.log" Apr 22 21:41:14.494659 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.494621 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8ktp7_8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb/kube-state-metrics/0.log" Apr 22 21:41:14.514511 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.514465 2569 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8ktp7_8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb/kube-rbac-proxy-main/0.log"
Apr 22 21:41:14.537096 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.537073 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8ktp7_8d4bb5e4-1b2f-4fdb-bb90-9147216fe7eb/kube-rbac-proxy-self/0.log"
Apr 22 21:41:14.564966 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.564935 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-748ffcd58d-lljq4_43473ce1-d3e6-424f-aef0-e230cda76179/metrics-server/0.log"
Apr 22 21:41:14.590670 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.590637 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-8mrrq_55622d37-e0f0-4be4-b871-5f5e077eff37/monitoring-plugin/0.log"
Apr 22 21:41:14.770085 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.770049 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jrgd2_832e8ef7-8db0-4d87-a721-afdfff094b49/node-exporter/0.log"
Apr 22 21:41:14.790587 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.790554 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jrgd2_832e8ef7-8db0-4d87-a721-afdfff094b49/kube-rbac-proxy/0.log"
Apr 22 21:41:14.811164 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.811115 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jrgd2_832e8ef7-8db0-4d87-a721-afdfff094b49/init-textfile/0.log"
Apr 22 21:41:14.834393 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.834366 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7zzq2_0ae169a5-d6ee-433b-af7c-56170339c262/kube-rbac-proxy-main/0.log"
Apr 22 21:41:14.854791 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.854759 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7zzq2_0ae169a5-d6ee-433b-af7c-56170339c262/kube-rbac-proxy-self/0.log"
Apr 22 21:41:14.876849 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.876818 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7zzq2_0ae169a5-d6ee-433b-af7c-56170339c262/openshift-state-metrics/0.log"
Apr 22 21:41:14.908679 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.908647 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1037e8-6f25-4a2d-b5f0-fb90d011338c/prometheus/0.log"
Apr 22 21:41:14.939172 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.939127 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1037e8-6f25-4a2d-b5f0-fb90d011338c/config-reloader/0.log"
Apr 22 21:41:14.961489 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.961458 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1037e8-6f25-4a2d-b5f0-fb90d011338c/thanos-sidecar/0.log"
Apr 22 21:41:14.983035 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:14.983007 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1037e8-6f25-4a2d-b5f0-fb90d011338c/kube-rbac-proxy-web/0.log"
Apr 22 21:41:15.004037 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.004004 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1037e8-6f25-4a2d-b5f0-fb90d011338c/kube-rbac-proxy/0.log"
Apr 22 21:41:15.024441 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.024366 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1037e8-6f25-4a2d-b5f0-fb90d011338c/kube-rbac-proxy-thanos/0.log"
Apr 22 21:41:15.046801 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.046735 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0e1037e8-6f25-4a2d-b5f0-fb90d011338c/init-config-reloader/0.log"
Apr 22 21:41:15.150171 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.149423 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74bbd6b687-tl28x_09610e14-4051-4f28-b405-a73b06e01c3f/telemeter-client/0.log"
Apr 22 21:41:15.167877 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.167826 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74bbd6b687-tl28x_09610e14-4051-4f28-b405-a73b06e01c3f/reload/0.log"
Apr 22 21:41:15.186701 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.186671 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-74bbd6b687-tl28x_09610e14-4051-4f28-b405-a73b06e01c3f/kube-rbac-proxy/0.log"
Apr 22 21:41:15.217028 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.216989 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bd6dd8cc8-pshnk_cd925cbe-dac6-4f84-8d08-c8337a350d1d/thanos-query/0.log"
Apr 22 21:41:15.236976 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.236922 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bd6dd8cc8-pshnk_cd925cbe-dac6-4f84-8d08-c8337a350d1d/kube-rbac-proxy-web/0.log"
Apr 22 21:41:15.256410 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.256381 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bd6dd8cc8-pshnk_cd925cbe-dac6-4f84-8d08-c8337a350d1d/kube-rbac-proxy/0.log"
Apr 22 21:41:15.275587 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.275509 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bd6dd8cc8-pshnk_cd925cbe-dac6-4f84-8d08-c8337a350d1d/prom-label-proxy/0.log"
Apr 22 21:41:15.295270 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.295239 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bd6dd8cc8-pshnk_cd925cbe-dac6-4f84-8d08-c8337a350d1d/kube-rbac-proxy-rules/0.log"
Apr 22 21:41:15.318165 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:15.318107 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6bd6dd8cc8-pshnk_cd925cbe-dac6-4f84-8d08-c8337a350d1d/kube-rbac-proxy-metrics/0.log"
Apr 22 21:41:16.532938 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.532887 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"]
Apr 22 21:41:16.539528 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.539498 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.547594 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.547552 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"]
Apr 22 21:41:16.651978 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.651935 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-lib-modules\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.652191 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.652009 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-proc\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.652191 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.652047 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-podres\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.652191 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.652090 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-sys\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.652370 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.652191 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgfzd\" (UniqueName: \"kubernetes.io/projected/9038738b-a69b-4448-9535-8e6d66cd8530-kube-api-access-pgfzd\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.752774 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.752732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgfzd\" (UniqueName: \"kubernetes.io/projected/9038738b-a69b-4448-9535-8e6d66cd8530-kube-api-access-pgfzd\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.752961 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.752798 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-lib-modules\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.752961 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.752840 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-proc\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.752961 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.752871 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-podres\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.752961 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.752914 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-sys\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.753211 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.753006 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-lib-modules\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.753211 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.753016 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-sys\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.753211 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.753023 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-proc\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.753211 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.753106 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9038738b-a69b-4448-9535-8e6d66cd8530-podres\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.760234 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.760209 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgfzd\" (UniqueName: \"kubernetes.io/projected/9038738b-a69b-4448-9535-8e6d66cd8530-kube-api-access-pgfzd\") pod \"perf-node-gather-daemonset-65mwp\" (UID: \"9038738b-a69b-4448-9535-8e6d66cd8530\") " pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:16.862215 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.862120 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:17.025240 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:17.025206 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"]
Apr 22 21:41:17.028525 ip-10-0-133-75 kubenswrapper[2569]: W0422 21:41:17.028494 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9038738b_a69b_4448_9535_8e6d66cd8530.slice/crio-5a337985207e537ebab37274810e2ae0d5914f86eb2c11c5a7ee2879625b1f38 WatchSource:0}: Error finding container 5a337985207e537ebab37274810e2ae0d5914f86eb2c11c5a7ee2879625b1f38: Status 404 returned error can't find the container with id 5a337985207e537ebab37274810e2ae0d5914f86eb2c11c5a7ee2879625b1f38
Apr 22 21:41:17.320711 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:17.320676 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7678c4655f-9x24b_bd4c7d34-55d1-4a9c-bf61-6d432ee34dff/console/0.log"
Apr 22 21:41:17.354385 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:17.354353 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp" event={"ID":"9038738b-a69b-4448-9535-8e6d66cd8530","Type":"ContainerStarted","Data":"09a9dc8764cb1c33da621acd6a66169de7ac624be3f3d033afe312b6468a54ff"}
Apr 22 21:41:17.354566 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:17.354390 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp" event={"ID":"9038738b-a69b-4448-9535-8e6d66cd8530","Type":"ContainerStarted","Data":"5a337985207e537ebab37274810e2ae0d5914f86eb2c11c5a7ee2879625b1f38"}
Apr 22 21:41:17.354566 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:17.354416 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:17.372826 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:17.372774 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp" podStartSLOduration=1.372758116 podStartE2EDuration="1.372758116s" podCreationTimestamp="2026-04-22 21:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:41:17.368298006 +0000 UTC m=+1925.241189861" watchObservedRunningTime="2026-04-22 21:41:17.372758116 +0000 UTC m=+1925.245649969"
Apr 22 21:41:18.686306 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:18.686274 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xdxhq_9bc6f374-c3e0-48a6-8231-9d5d16eea2ac/dns/0.log"
Apr 22 21:41:18.708410 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:18.708386 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xdxhq_9bc6f374-c3e0-48a6-8231-9d5d16eea2ac/kube-rbac-proxy/0.log"
Apr 22 21:41:18.748818 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:18.748796 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rllmv_e62d12e3-203c-47e6-8292-aebabba9b716/dns-node-resolver/0.log"
Apr 22 21:41:19.250861 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:19.250819 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-q864p_5e2e8788-770b-4c8b-aa3b-d52af912e57b/node-ca/0.log"
Apr 22 21:41:20.216523 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:20.216491 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-567cb9698d-xq5ml_8fd18c56-fd13-490f-8e1b-a5c5afa8cc0a/kube-auth-proxy/0.log"
Apr 22 21:41:20.786693 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:20.786667 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z44pn_661d84d5-8c52-4e6c-823f-add2e843f2a4/serve-healthcheck-canary/0.log"
Apr 22 21:41:21.233635 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:21.233562 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j7nqz_e22ce551-77f1-4198-bdd2-edf964b0a064/kube-rbac-proxy/0.log"
Apr 22 21:41:21.252121 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:21.252085 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j7nqz_e22ce551-77f1-4198-bdd2-edf964b0a064/exporter/0.log"
Apr 22 21:41:21.271725 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:21.271703 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j7nqz_e22ce551-77f1-4198-bdd2-edf964b0a064/extractor/0.log"
Apr 22 21:41:23.369460 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:23.369421 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:23.398174 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:23.398127 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-754bfc4657-28svj_c42afd88-4c86-464c-b4d1-b5705f790287/manager/0.log"
Apr 22 21:41:24.545683 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:24.545648 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7979f84667-9gxqw_257ddb76-5df7-44e1-8222-4f8fd3909da9/manager/0.log"
Apr 22 21:41:24.590610 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:24.590580 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-gfrvn_1ba1824a-dda6-4c46-a8f1-c95420a43eb1/openshift-lws-operator/0.log"
Apr 22 21:41:30.528077 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:30.528052 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z85g_d1d59cdd-035f-4424-9def-015beb3b369f/kube-multus-additional-cni-plugins/0.log"
Apr 22 21:41:30.551768 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:30.551738 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z85g_d1d59cdd-035f-4424-9def-015beb3b369f/egress-router-binary-copy/0.log"
Apr 22 21:41:30.577961 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:30.577933 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z85g_d1d59cdd-035f-4424-9def-015beb3b369f/cni-plugins/0.log"
Apr 22 21:41:30.598129 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:30.598106 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z85g_d1d59cdd-035f-4424-9def-015beb3b369f/bond-cni-plugin/0.log"
Apr 22 21:41:30.616727 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:30.616708 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z85g_d1d59cdd-035f-4424-9def-015beb3b369f/routeoverride-cni/0.log"
Apr 22 21:41:30.635764 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:30.635733 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z85g_d1d59cdd-035f-4424-9def-015beb3b369f/whereabouts-cni-bincopy/0.log"
Apr 22 21:41:30.657202 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:30.657180 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z85g_d1d59cdd-035f-4424-9def-015beb3b369f/whereabouts-cni/0.log"
Apr 22 21:41:31.030978 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:31.030946 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xv5js_d9420aad-9147-4086-9dbb-2f74a2f65676/kube-multus/0.log"
Apr 22 21:41:31.092670 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:31.092638 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d7j8j_80bac7af-2767-4aee-b3fa-d0683f389b6a/network-metrics-daemon/0.log"
Apr 22 21:41:31.108133 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:31.108103 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-d7j8j_80bac7af-2767-4aee-b3fa-d0683f389b6a/kube-rbac-proxy/0.log"
Apr 22 21:41:32.493334 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:32.493301 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-controller/0.log"
Apr 22 21:41:32.508436 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:32.508411 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/0.log"
Apr 22 21:41:32.524761 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:32.524735 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovn-acl-logging/1.log"
Apr 22 21:41:32.543296 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:32.543270 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/kube-rbac-proxy-node/0.log"
Apr 22 21:41:32.564523 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:32.564498 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 21:41:32.583039 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:32.583021 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/northd/0.log"
Apr 22 21:41:32.600918 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:32.600895 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/nbdb/0.log"
Apr 22 21:41:32.619243 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:32.619215 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/sbdb/0.log"
Apr 22 21:41:32.801964 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:32.801931 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sshlp_62da3121-b9c0-42d1-b441-45c1a4816f11/ovnkube-controller/0.log"
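The entries above trace a complete pod startup for perf-node-gather-daemonset-65mwp: SyncLoop ADD, volume attach/mount, ContainerStarted PLEG events, then the readiness probe flipping from "not ready" to "ready". When reading a capture like this offline, it can help to pull out just the SyncLoop events for one pod. Below is a minimal sketch under the assumption that the lines follow the journald + klog format shown in this capture; `syncloop_events` and the `LINE` regex are illustrative helpers, not part of any kubelet tooling, and the embedded sample is abridged from the log above.

```python
import re

# Abridged sample lines copied from the capture above.
SAMPLE = """\
Apr 22 21:41:16.532938 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:16.532887 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"]
Apr 22 21:41:17.354566 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:17.354416 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
Apr 22 21:41:23.369460 ip-10-0-133-75 kubenswrapper[2569]: I0422 21:41:23.369421 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nnb7p/perf-node-gather-daemonset-65mwp"
"""

# journald timestamp, host, process, klog header, then the quoted message.
LINE = re.compile(
    r'^(\w+ \d+ [\d:.]+) \S+ \S+: [IWE]\d+ \S+ \d+ \S+\] "(SyncLoop[^"]*)"(.*)$'
)

def syncloop_events(text, pod):
    """Yield (timestamp, event, detail) for SyncLoop lines mentioning `pod`."""
    for line in text.splitlines():
        m = LINE.match(line)
        if m and pod in line:
            yield m.group(1), m.group(2), m.group(3).strip()

events = list(syncloop_events(SAMPLE, "perf-node-gather-daemonset-65mwp"))
for ts, ev, detail in events:
    print(ts, ev)
```

The same filter applied to the full journal would also surface the PLEG ContainerStarted events, since they share the "SyncLoop" message prefix.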