Apr 20 19:25:44.183989 ip-10-0-135-55 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 19:25:44.641883 ip-10-0-135-55 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:25:44.641883 ip-10-0-135-55 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 19:25:44.641883 ip-10-0-135-55 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:25:44.641883 ip-10-0-135-55 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 19:25:44.641883 ip-10-0-135-55 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
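The deprecation warnings above all point at the file passed via the kubelet's --config flag. As a hedged illustration (not taken from this node — the field values below are placeholders), the flagged parameters map to these KubeletConfiguration fields:

```yaml
# Hypothetical kubelet config-file fragment; values are illustrative only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# the --minimum-container-ttl-duration warning says to use eviction settings instead
evictionHard:
  memory.available: 100Mi
```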
Apr 20 19:25:44.643622 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.643503 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 19:25:44.648222 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648190 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:44.648222 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648215 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:44.648222 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648219 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:44.648222 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648223 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:44.648222 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648230 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:44.648222 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648234 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:44.648222 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648238 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648241 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648245 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648249 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648252 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648256 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648260 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648263 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648267 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648270 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648274 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648277 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648280 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648284 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648287 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648291 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648295 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648299 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648303 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648306 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:44.648647 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648309 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648313 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648324 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648328 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648332 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648336 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648340 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648344 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648347 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648350 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648354 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648358 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648361 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648364 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648368 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648373 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648376 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648380 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648384 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648387 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:44.649510 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648391 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648396 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648400 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648405 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648409 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648413 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648418 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648422 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648427 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648433 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648437 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648441 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648445 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648449 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648453 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648458 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648463 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648468 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648472 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:44.650157 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648476 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648484 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648492 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648497 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648505 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648509 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648516 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648523 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648528 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648533 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648538 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648542 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648546 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648551 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648556 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648560 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648564 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648569 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:44.650740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648573 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648577 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.648582 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649294 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649304 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649309 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649313 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649318 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649323 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649327 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649331 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649336 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649340 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649345 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649349 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649353 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649357 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649362 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649366 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649370 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:44.651494 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649374 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649378 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649382 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649386 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649390 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649394 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649399 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649404 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649408 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649413 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649417 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649421 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649425 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649429 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649433 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649436 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649441 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649444 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649449 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649453 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:44.652291 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649457 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649461 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649465 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649469 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649474 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649478 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649482 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649486 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649491 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649495 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649499 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649505 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649509 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649512 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649517 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649521 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649525 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649529 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649533 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649537 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:44.652876 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649555 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649560 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649564 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649568 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649572 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649577 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649580 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649585 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649588 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649592 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649596 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649603 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649630 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649636 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649642 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649646 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649651 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649657 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649664 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:44.653471 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649669 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649674 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649678 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649682 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649687 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649691 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649698 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649702 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649707 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.649711 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650528 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650543 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650556 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650563 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650571 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650576 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650583 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650591 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650596 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650601 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650606 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650634 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 19:25:44.654159 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650639 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650644 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650649 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650654 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650659 2575 flags.go:64] FLAG: --cloud-config=""
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650664 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650669 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650676 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650681 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650687 2575 flags.go:64] FLAG: --config-dir=""
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650692 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650697 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650704 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650709 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650715 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650720 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650725 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650729 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650734 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650740 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650744 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650751 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650756 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650760 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650765 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 19:25:44.654794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650771 2575 flags.go:64] FLAG: --enable-server="true"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650776 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650784 2575 flags.go:64] FLAG: --event-burst="100"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650790 2575 flags.go:64] FLAG: --event-qps="50"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650796 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650801 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650806 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650813 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650817 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650822 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650827 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650833 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650838 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650842 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650847 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 19:25:44.655446
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650852 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650857 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650862 2575 flags.go:64] FLAG: --feature-gates="" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650868 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650873 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650879 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650884 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650889 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650895 2575 flags.go:64] FLAG: --help="false" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650900 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-135-55.ec2.internal" Apr 20 19:25:44.655446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650905 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650910 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650915 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650921 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 
19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650927 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650932 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650937 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650941 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650947 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650952 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650958 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650962 2575 flags.go:64] FLAG: --kube-reserved="" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650967 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650971 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650977 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650982 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650987 2575 flags.go:64] FLAG: --lock-file="" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650992 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.650996 2575 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651001 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651010 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651014 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651019 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 19:25:44.656160 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651024 2575 flags.go:64] FLAG: --logging-format="text" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651028 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651034 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651039 2575 flags.go:64] FLAG: --manifest-url="" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651043 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651079 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651086 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651094 2575 flags.go:64] FLAG: --max-pods="110" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651100 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651105 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: 
I0420 19:25:44.651110 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651114 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651119 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651124 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651129 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651143 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651149 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651154 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651160 2575 flags.go:64] FLAG: --pod-cidr="" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651165 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651174 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651179 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651184 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 20 19:25:44.656761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651189 2575 flags.go:64] FLAG: --port="10250" Apr 20 19:25:44.656761 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:25:44.651194 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651199 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08e0c74c6488feeba" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651204 2575 flags.go:64] FLAG: --qos-reserved="" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651209 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651214 2575 flags.go:64] FLAG: --register-node="true" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651219 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651224 2575 flags.go:64] FLAG: --register-with-taints="" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651230 2575 flags.go:64] FLAG: --registry-burst="10" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651235 2575 flags.go:64] FLAG: --registry-qps="5" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651240 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651244 2575 flags.go:64] FLAG: --reserved-memory="" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651250 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651256 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651260 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651265 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 19:25:44.657350 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:25:44.651270 2575 flags.go:64] FLAG: --runonce="false" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651275 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651280 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651285 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651289 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651294 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651299 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651304 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651314 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651319 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651325 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 19:25:44.657350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651331 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651337 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651342 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651347 2575 flags.go:64] FLAG: 
--system-cgroups="" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651351 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651360 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651365 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651369 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651375 2575 flags.go:64] FLAG: --tls-min-version="" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651380 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651385 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651389 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651394 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651399 2575 flags.go:64] FLAG: --v="2" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651406 2575 flags.go:64] FLAG: --version="false" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651412 2575 flags.go:64] FLAG: --vmodule="" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651419 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.651424 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651587 
2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651596 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651602 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651629 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651634 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:25:44.658086 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651639 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651643 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651647 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651651 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651655 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651659 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651665 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651670 2575 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNSInstall Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651674 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651680 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651685 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651690 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651694 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651698 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651703 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651707 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651711 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651715 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651719 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651723 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:25:44.658737 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651726 2575 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651730 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651735 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651740 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651753 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651757 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651762 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651766 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651770 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651774 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651778 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651782 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651787 2575 feature_gate.go:328] unrecognized feature gate: 
EtcdBackendQuota Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651791 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651795 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651799 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651803 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651807 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651814 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651818 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:25:44.659247 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651822 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651830 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651836 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651840 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651846 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651850 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651854 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651858 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651863 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651867 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651872 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651876 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651880 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651884 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651888 2575 feature_gate.go:328] unrecognized feature gate: 
AutomatedEtcdBackup Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651892 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651896 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651900 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651904 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651908 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:25:44.659791 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651912 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651917 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651921 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651925 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651929 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651934 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651938 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651942 2575 
feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651946 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651950 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651956 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651960 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651964 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651970 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651974 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651978 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651982 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.651995 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.652000 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:44.660294 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.652004 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.652008 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.652899 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.660241 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.660264 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660317 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660326 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660330 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660334 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660337 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660340 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660343 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660346 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660348 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660351 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:44.660792 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660353 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660356 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660359 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660361 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660364 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660367 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660369 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660372 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660375 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660379 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660381 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660384 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660386 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660389 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660391 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660394 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660396 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660399 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660401 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660404 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:44.661174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660406 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660411 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660414 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660416 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660419 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660422 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660424 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660427 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660429 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660431 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660434 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660436 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660439 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660441 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660444 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660447 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660451 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660454 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660457 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:44.661693 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660460 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660463 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660466 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660468 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660471 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660473 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660476 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660478 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660481 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660483 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660486 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660488 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660491 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660494 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660497 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660500 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660503 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660506 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660508 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660510 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:44.662192 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660513 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660515 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660518 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660520 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660523 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660525 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660528 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660530 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660533 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660535 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660538 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660540 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660543 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660545 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660548 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660550 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:44.662713 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660553 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.660558 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660682 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660687 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660690 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660693 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660696 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660699 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660702 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660705 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660708 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660711 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660715 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660718 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660721 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660724 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:44.663108 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660727 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660730 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660732 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660735 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660738 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660741 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660743 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660746 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660748 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660750 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660753 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660756 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660758 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660761 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660763 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660766 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660768 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660771 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660773 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:44.663516 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660777 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660780 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660783 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660786 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660789 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660792 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660795 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660798 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660802 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660805 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660808 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660811 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660814 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660816 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660819 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660821 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660824 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660826 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660829 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660832 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:44.664023 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660834 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660837 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660839 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660842 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660844 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660846 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660849 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660851 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660854 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660856 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660859 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660861 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660863 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660866 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660868 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660871 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660873 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660875 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660878 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660881 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:44.664513 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660884 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660886 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660889 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660893 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660897 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660899 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660902 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660905 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660907 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660910 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660912 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660915 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:44.660917 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.660922 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.661672 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 19:25:44.665098 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.664928 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 19:25:44.665996 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.665982 2575 server.go:1019] "Starting client certificate rotation"
Apr 20 19:25:44.666103 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.666083 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:25:44.666139 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.666129 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:25:44.691037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.691012 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:25:44.693863 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.693840 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:25:44.711073 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.711036 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 20 19:25:44.717335 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.717308 2575 log.go:25] "Validated CRI v1 image API"
Apr 20 19:25:44.718757 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.718738 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 19:25:44.723231 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.723207 2575 fs.go:135] Filesystem UUIDs: map[082307e8-0ec7-460d-a41e-01cf0b98ec7c:/dev/nvme0n1p3 55ae434b-7102-40e0-b037-640571f0df4d:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 20 19:25:44.723285 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.723232 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 19:25:44.729215 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.729071 2575 manager.go:217] Machine: {Timestamp:2026-04-20 19:25:44.727107667 +0000 UTC m=+0.417520932 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100249 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2575558c3b76a00fcb24d62ad04dad SystemUUID:ec257555-8c3b-76a0-0fcb-24d62ad04dad BootID:48dc5791-bfaf-47af-a4f1-95192c7a7fe1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6b:4e:7c:f0:05 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6b:4e:7c:f0:05 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2e:1b:0d:d5:e1:2a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 19:25:44.729215 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.729205 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 19:25:44.729384 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.729370 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 19:25:44.730691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.730648 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 19:25:44.730846 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.730692 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-55.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 19:25:44.730897 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.730857 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 19:25:44.730897 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.730865 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 19:25:44.730897 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.730879 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 19:25:44.731861 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.731845 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 19:25:44.733286 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.733271 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 19:25:44.733414 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.733405 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 19:25:44.736541 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.736526 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 20 19:25:44.736582 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.736547 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 19:25:44.736582 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.736561 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 19:25:44.736582 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.736572 2575 kubelet.go:397] "Adding apiserver pod source" Apr 20 19:25:44.736582 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.736580 2575 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 20 19:25:44.737038 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.737009 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 19:25:44.737824 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.737811 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 19:25:44.737872 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.737832 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 19:25:44.741584 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.741566 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 19:25:44.743783 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.743767 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 19:25:44.745368 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745352 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 19:25:44.745408 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745385 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 19:25:44.745408 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745395 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 19:25:44.745408 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745402 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 19:25:44.745408 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745407 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 19:25:44.745528 ip-10-0-135-55 kubenswrapper[2575]: 
I0420 19:25:44.745413 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 19:25:44.745528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745420 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 19:25:44.745528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745426 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 19:25:44.745528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745433 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 19:25:44.745528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745439 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 19:25:44.745528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745456 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 19:25:44.745528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.745465 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 19:25:44.746184 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.746174 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 19:25:44.746184 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.746185 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 19:25:44.750046 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.750030 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 19:25:44.750090 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.750077 2575 server.go:1295] "Started kubelet" Apr 20 19:25:44.750289 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.750229 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 19:25:44.750369 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.750316 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 
19:25:44.750583 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.750558 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 19:25:44.751084 ip-10-0-135-55 systemd[1]: Started Kubernetes Kubelet. Apr 20 19:25:44.754802 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.754778 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 19:25:44.755561 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.755536 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-55.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 19:25:44.756141 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.756081 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 19:25:44.756251 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.756181 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 19:25:44.757144 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.757123 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 20 19:25:44.759562 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.759540 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 19:25:44.759562 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.759547 2575 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kubelet-serving" Apr 20 19:25:44.760375 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.760350 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 19:25:44.760375 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.760379 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 19:25:44.760505 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.760408 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 19:25:44.760505 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.760467 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 20 19:25:44.760505 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.760478 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 20 19:25:44.760653 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.760560 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:44.761848 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.761830 2575 factory.go:55] Registering systemd factory Apr 20 19:25:44.761971 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.761939 2575 factory.go:223] Registration of the systemd container factory successfully Apr 20 19:25:44.762220 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.762203 2575 factory.go:153] Registering CRI-O factory Apr 20 19:25:44.762220 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.762218 2575 factory.go:223] Registration of the crio container factory successfully Apr 20 19:25:44.762362 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.762279 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 19:25:44.762362 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.762305 2575 factory.go:103] 
Registering Raw factory Apr 20 19:25:44.762362 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.762320 2575 manager.go:1196] Started watching for new ooms in manager Apr 20 19:25:44.762758 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.762743 2575 manager.go:319] Starting recovery of all containers Apr 20 19:25:44.763428 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.763401 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-55.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 19:25:44.763553 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.763503 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 19:25:44.763645 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.763578 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 19:25:44.764718 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.763438 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-55.ec2.internal.18a827311497fc91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-55.ec2.internal,UID:ip-10-0-135-55.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-55.ec2.internal,},FirstTimestamp:2026-04-20 19:25:44.750046353 +0000 UTC m=+0.440459618,LastTimestamp:2026-04-20 19:25:44.750046353 +0000 UTC m=+0.440459618,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-55.ec2.internal,}" Apr 20 19:25:44.772398 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.772356 2575 manager.go:324] Recovery completed Apr 20 19:25:44.776348 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.776318 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kvxpp" Apr 20 19:25:44.777977 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.777960 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:25:44.780860 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.780827 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:25:44.780860 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.780864 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:25:44.781030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.780877 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:25:44.781475 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.781451 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 19:25:44.781475 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.781471 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 19:25:44.781637 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.781492 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 
19:25:44.783180 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.783107 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-55.ec2.internal.18a82731166dfbbe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-55.ec2.internal,UID:ip-10-0-135-55.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-55.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-55.ec2.internal,},FirstTimestamp:2026-04-20 19:25:44.780848062 +0000 UTC m=+0.471261327,LastTimestamp:2026-04-20 19:25:44.780848062 +0000 UTC m=+0.471261327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-55.ec2.internal,}" Apr 20 19:25:44.784117 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.784099 2575 policy_none.go:49] "None policy: Start" Apr 20 19:25:44.784181 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.784121 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 19:25:44.784181 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.784132 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 20 19:25:44.790563 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.790538 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kvxpp" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.835428 2575 manager.go:341] "Starting Device Plugin manager" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.835485 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" 
checkpoint="kubelet_internal_checkpoint" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.835498 2575 server.go:85] "Starting device plugin registration server" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.835862 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.835876 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.835972 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.836090 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.836102 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.837825 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 19:25:44.839674 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.837864 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:44.901247 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.901160 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 19:25:44.902595 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.902575 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 20 19:25:44.902707 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.902627 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 19:25:44.902707 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.902661 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 19:25:44.902707 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.902670 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 19:25:44.902830 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.902777 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 19:25:44.906888 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.906858 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:25:44.936541 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.936497 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:25:44.937729 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.937707 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:25:44.937831 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.937751 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:25:44.937831 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.937766 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:25:44.937831 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.937791 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-55.ec2.internal" Apr 20 19:25:44.952922 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:44.952897 2575 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-55.ec2.internal" Apr 20 19:25:44.953008 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:44.952927 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-55.ec2.internal\": node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.003041 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.002985 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal"] Apr 20 19:25:45.003245 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.003117 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:25:45.004831 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.004811 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:25:45.004941 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.004847 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:25:45.004941 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.004861 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:25:45.007166 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.007146 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:25:45.007313 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.007289 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.007369 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.007335 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:25:45.008037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.008010 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:25:45.008037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.008024 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:25:45.008204 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.008046 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:25:45.008204 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.008045 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:25:45.008204 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.008072 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:25:45.008204 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.008057 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:25:45.010372 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.010355 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.010449 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.010385 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:25:45.011203 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.011186 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:25:45.011269 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.011214 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:25:45.011269 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.011227 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:25:45.028377 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.028345 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-55.ec2.internal\" not found" node="ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.032165 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.032145 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-55.ec2.internal\" not found" node="ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.047874 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.047843 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.062577 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.062544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/60834707d5f53c027199cd5bc82fb1f6-config\") pod 
\"kube-apiserver-proxy-ip-10-0-135-55.ec2.internal\" (UID: \"60834707d5f53c027199cd5bc82fb1f6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.062577 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.062577 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2475b11f2cca7a40145b7c82467d78f7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal\" (UID: \"2475b11f2cca7a40145b7c82467d78f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.062781 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.062597 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2475b11f2cca7a40145b7c82467d78f7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal\" (UID: \"2475b11f2cca7a40145b7c82467d78f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.148938 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.148905 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.163596 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.163527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2475b11f2cca7a40145b7c82467d78f7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal\" (UID: \"2475b11f2cca7a40145b7c82467d78f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.163596 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.163573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/2475b11f2cca7a40145b7c82467d78f7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal\" (UID: \"2475b11f2cca7a40145b7c82467d78f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.163596 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.163591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/60834707d5f53c027199cd5bc82fb1f6-config\") pod \"kube-apiserver-proxy-ip-10-0-135-55.ec2.internal\" (UID: \"60834707d5f53c027199cd5bc82fb1f6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.163744 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.163645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2475b11f2cca7a40145b7c82467d78f7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal\" (UID: \"2475b11f2cca7a40145b7c82467d78f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.163744 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.163656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2475b11f2cca7a40145b7c82467d78f7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal\" (UID: \"2475b11f2cca7a40145b7c82467d78f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.163744 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.163652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/60834707d5f53c027199cd5bc82fb1f6-config\") pod \"kube-apiserver-proxy-ip-10-0-135-55.ec2.internal\" (UID: \"60834707d5f53c027199cd5bc82fb1f6\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.249773 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.249740 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.330946 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.330912 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.334708 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.334687 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal" Apr 20 19:25:45.349921 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.349885 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.450587 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.450500 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.551071 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.551044 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.651713 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.651669 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.666066 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.666034 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 19:25:45.666236 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.666219 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:25:45.752938 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.752738 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.760093 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.760069 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 19:25:45.792409 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.792330 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:20:44 +0000 UTC" deadline="2028-01-06 03:02:27.567027606 +0000 UTC" Apr 20 19:25:45.792409 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.792384 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15007h36m41.774648973s" Apr 20 19:25:45.795940 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.795913 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 19:25:45.810254 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.810219 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:25:45.853416 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.853383 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.866542 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:45.866506 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60834707d5f53c027199cd5bc82fb1f6.slice/crio-9ca39882440d8ee8e399e4a29ca46ed4c945898136e7c21abf181fa88a92352b WatchSource:0}: Error finding container 9ca39882440d8ee8e399e4a29ca46ed4c945898136e7c21abf181fa88a92352b: Status 404 returned error can't find the container with id 9ca39882440d8ee8e399e4a29ca46ed4c945898136e7c21abf181fa88a92352b Apr 20 19:25:45.873214 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.873189 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:25:45.877293 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:45.877265 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2475b11f2cca7a40145b7c82467d78f7.slice/crio-73c4a37a829bb7af12e11e8026b2bf736f0ce52e13e2b70bdda228e2577160f4 WatchSource:0}: Error finding container 73c4a37a829bb7af12e11e8026b2bf736f0ce52e13e2b70bdda228e2577160f4: Status 404 returned error can't find the container with id 73c4a37a829bb7af12e11e8026b2bf736f0ce52e13e2b70bdda228e2577160f4 Apr 20 19:25:45.906470 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.906414 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" event={"ID":"2475b11f2cca7a40145b7c82467d78f7","Type":"ContainerStarted","Data":"73c4a37a829bb7af12e11e8026b2bf736f0ce52e13e2b70bdda228e2577160f4"} Apr 20 19:25:45.907528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.907505 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal" event={"ID":"60834707d5f53c027199cd5bc82fb1f6","Type":"ContainerStarted","Data":"9ca39882440d8ee8e399e4a29ca46ed4c945898136e7c21abf181fa88a92352b"} Apr 20 19:25:45.953786 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:45.953750 2575 kubelet_node_status.go:515] "Error getting the 
current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:45.997211 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:45.997154 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-klw4b" Apr 20 19:25:46.018964 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.018939 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-klw4b" Apr 20 19:25:46.022680 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.022649 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:25:46.054156 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.054124 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:46.154743 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.154696 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-55.ec2.internal\" not found" Apr 20 19:25:46.214081 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.214050 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:25:46.260134 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.260056 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" Apr 20 19:25:46.278158 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.278124 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 19:25:46.279338 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.279302 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal" 
Apr 20 19:25:46.290451 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.290420 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 19:25:46.738584 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.738496 2575 apiserver.go:52] "Watching apiserver" Apr 20 19:25:46.747052 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.747013 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 19:25:46.749594 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.749564 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal","openshift-multus/multus-additional-cni-plugins-wbqgg","openshift-network-diagnostics/network-check-target-64gnz","openshift-network-operator/iptables-alerter-z7wvh","kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal","openshift-dns/node-resolver-bttgn","openshift-image-registry/node-ca-b6xwm","openshift-multus/multus-545qf","openshift-multus/network-metrics-daemon-j9xp6","openshift-ovn-kubernetes/ovnkube-node-w558t","kube-system/konnectivity-agent-724ct","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54","openshift-cluster-node-tuning-operator/tuned-79jvp"] Apr 20 19:25:46.752170 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.752136 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bttgn" Apr 20 19:25:46.754414 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.754388 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.755181 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.755156 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 19:25:46.755181 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.755175 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 19:25:46.755181 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.755183 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lfkbv\"" Apr 20 19:25:46.756999 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.756976 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z7wvh" Apr 20 19:25:46.757223 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.757205 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 19:25:46.757565 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.757545 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-xnv4f\"" Apr 20 19:25:46.757938 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.757910 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 19:25:46.757938 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.757925 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 19:25:46.758438 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.758409 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" 
Apr 20 19:25:46.758525 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.758449 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 19:25:46.759008 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.758989 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 19:25:46.759150 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.759012 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:25:46.759446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.759430 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:25:46.759527 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.759498 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e" Apr 20 19:25:46.761903 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.761880 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b6xwm" Apr 20 19:25:46.764918 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.764481 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-545qf" Apr 20 19:25:46.766105 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.766079 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 19:25:46.766242 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.766199 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-btbzq\"" Apr 20 19:25:46.766309 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.766294 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 19:25:46.766502 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.766480 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 19:25:46.766587 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.766516 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 19:25:46.766985 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.766967 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:25:46.767154 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.767015 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lr9cl\"" Apr 20 19:25:46.767154 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.767036 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6" Apr 20 19:25:46.767373 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.767358 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 19:25:46.767412 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.767380 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 19:25:46.767656 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.767638 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8qcr9\"" Apr 20 19:25:46.767656 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.767653 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 19:25:46.767824 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.767748 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 19:25:46.769555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.769507 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.770769 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.770746 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 19:25:46.771992 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.771969 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-724ct" Apr 20 19:25:46.772249 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.772230 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 19:25:46.772517 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.771972 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 19:25:46.772878 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.772860 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xrthd\"" Apr 20 19:25:46.773161 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5nc4\" (UniqueName: \"kubernetes.io/projected/97e58d97-c4b1-4d4a-a6f6-d87a86138255-kube-api-access-c5nc4\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773255 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl977\" (UniqueName: \"kubernetes.io/projected/9ce44109-8d3b-4499-b0fb-fa475f90e132-kube-api-access-dl977\") pod \"node-resolver-bttgn\" (UID: \"9ce44109-8d3b-4499-b0fb-fa475f90e132\") " pod="openshift-dns/node-resolver-bttgn" Apr 20 19:25:46.773255 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-run-netns\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773255 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-log-socket\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773417 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773260 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-cni-netd\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773417 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4d4a78de-0a92-44ec-af5e-719f1eca3757-iptables-alerter-script\") pod \"iptables-alerter-z7wvh\" (UID: \"4d4a78de-0a92-44ec-af5e-719f1eca3757\") " pod="openshift-network-operator/iptables-alerter-z7wvh" Apr 20 19:25:46.773417 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d4a78de-0a92-44ec-af5e-719f1eca3757-host-slash\") pod \"iptables-alerter-z7wvh\" (UID: \"4d4a78de-0a92-44ec-af5e-719f1eca3757\") " pod="openshift-network-operator/iptables-alerter-z7wvh" Apr 20 19:25:46.773417 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/9ce44109-8d3b-4499-b0fb-fa475f90e132-hosts-file\") pod \"node-resolver-bttgn\" (UID: \"9ce44109-8d3b-4499-b0fb-fa475f90e132\") " pod="openshift-dns/node-resolver-bttgn" Apr 20 19:25:46.773417 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773678 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148d86dd-6cae-42a6-9e0e-44b0a13baa33-serviceca\") pod \"node-ca-b6xwm\" (UID: \"148d86dd-6cae-42a6-9e0e-44b0a13baa33\") " pod="openshift-image-registry/node-ca-b6xwm" Apr 20 19:25:46.773678 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-run-openvswitch\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773678 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-systemd-units\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773678 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773556 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-run-ovn\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773678 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-run-ovn-kubernetes\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773678 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-kubelet\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773678 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-slash\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-var-lib-openvswitch\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-node-log\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-cni-bin\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97e58d97-c4b1-4d4a-a6f6-d87a86138255-ovnkube-config\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntgvd\" (UniqueName: \"kubernetes.io/projected/148d86dd-6cae-42a6-9e0e-44b0a13baa33-kube-api-access-ntgvd\") pod \"node-ca-b6xwm\" (UID: \"148d86dd-6cae-42a6-9e0e-44b0a13baa33\") " pod="openshift-image-registry/node-ca-b6xwm" Apr 20 19:25:46.773947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/97e58d97-c4b1-4d4a-a6f6-d87a86138255-ovnkube-script-lib\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-etc-openvswitch\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.773947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq2rx\" (UniqueName: \"kubernetes.io/projected/4d4a78de-0a92-44ec-af5e-719f1eca3757-kube-api-access-hq2rx\") pod \"iptables-alerter-z7wvh\" (UID: \"4d4a78de-0a92-44ec-af5e-719f1eca3757\") " pod="openshift-network-operator/iptables-alerter-z7wvh" Apr 20 19:25:46.773947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:25:46.774226 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.773968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-run-systemd\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.774226 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:25:46.773984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/148d86dd-6cae-42a6-9e0e-44b0a13baa33-host\") pod \"node-ca-b6xwm\" (UID: \"148d86dd-6cae-42a6-9e0e-44b0a13baa33\") " pod="openshift-image-registry/node-ca-b6xwm"
Apr 20 19:25:46.774226 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.774002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ce44109-8d3b-4499-b0fb-fa475f90e132-tmp-dir\") pod \"node-resolver-bttgn\" (UID: \"9ce44109-8d3b-4499-b0fb-fa475f90e132\") " pod="openshift-dns/node-resolver-bttgn"
Apr 20 19:25:46.774226 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.774021 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97e58d97-c4b1-4d4a-a6f6-d87a86138255-env-overrides\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.774226 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.774067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97e58d97-c4b1-4d4a-a6f6-d87a86138255-ovn-node-metrics-cert\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.774900 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.774878 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54"
Apr 20 19:25:46.777358 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.777340 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.780751 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.780730 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 19:25:46.781142 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.781048 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 19:25:46.781142 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.781086 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 19:25:46.782028 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.781671 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:25:46.782028 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.781803 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ds5nn\""
Apr 20 19:25:46.782028 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.781969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-66qhr\""
Apr 20 19:25:46.782251 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.782222 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 19:25:46.782705 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.782402 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wsd5d\""
Apr 20 19:25:46.782705 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.782411 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 19:25:46.782705 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.782687 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 19:25:46.861234 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.861199 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 19:25:46.874916 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.874882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97e58d97-c4b1-4d4a-a6f6-d87a86138255-env-overrides\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.874916 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.874926 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97e58d97-c4b1-4d4a-a6f6-d87a86138255-ovn-node-metrics-cert\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.875167 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.874961 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-socket-dir-parent\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.875167 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.874989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg"
Apr 20 19:25:46.875167 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg"
Apr 20 19:25:46.875167 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-run\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.875167 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl977\" (UniqueName: \"kubernetes.io/projected/9ce44109-8d3b-4499-b0fb-fa475f90e132-kube-api-access-dl977\") pod \"node-resolver-bttgn\" (UID: \"9ce44109-8d3b-4499-b0fb-fa475f90e132\") " pod="openshift-dns/node-resolver-bttgn"
Apr 20 19:25:46.875167 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875095 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-cni-binary-copy\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg"
Apr 20 19:25:46.875167 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875118 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvw8s\" (UniqueName: \"kubernetes.io/projected/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-kube-api-access-dvw8s\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54"
Apr 20 19:25:46.875167 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-cni-binary-copy\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.875167 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-var-lib-cni-bin\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-hostroot\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875281 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-system-cni-dir\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-os-release\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875320 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-host\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-systemd-units\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875359 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-run-ovn-kubernetes\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-socket-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/97e58d97-c4b1-4d4a-a6f6-d87a86138255-env-overrides\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-systemd-units\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-run-ovn-kubernetes\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875460 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-var-lib-kubelet\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875448 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-sysconfig\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-tmp\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.875567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-kubelet\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875628 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-slash\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97e58d97-c4b1-4d4a-a6f6-d87a86138255-ovnkube-config\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-kubelet\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-slash\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ntgvd\" (UniqueName: \"kubernetes.io/projected/148d86dd-6cae-42a6-9e0e-44b0a13baa33-kube-api-access-ntgvd\") pod \"node-ca-b6xwm\" (UID: \"148d86dd-6cae-42a6-9e0e-44b0a13baa33\") " pod="openshift-image-registry/node-ca-b6xwm"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-os-release\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.875887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97e58d97-c4b1-4d4a-a6f6-d87a86138255-ovnkube-script-lib\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hq2rx\" (UniqueName: \"kubernetes.io/projected/4d4a78de-0a92-44ec-af5e-719f1eca3757-kube-api-access-hq2rx\") pod \"iptables-alerter-z7wvh\" (UID: \"4d4a78de-0a92-44ec-af5e-719f1eca3757\") " pod="openshift-network-operator/iptables-alerter-z7wvh"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876040 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-etc-selinux\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-system-cni-dir\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/97e58d97-c4b1-4d4a-a6f6-d87a86138255-ovnkube-config\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-run-multus-certs\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-run-systemd\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-registration-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876298 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 19:25:46.876346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-run-systemd\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/97e58d97-c4b1-4d4a-a6f6-d87a86138255-ovnkube-script-lib\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-sysctl-conf\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ce44109-8d3b-4499-b0fb-fa475f90e132-tmp-dir\") pod \"node-resolver-bttgn\" (UID: \"9ce44109-8d3b-4499-b0fb-fa475f90e132\") " pod="openshift-dns/node-resolver-bttgn"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-run-k8s-cni-cncf-io\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876510 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm7wn\" (UniqueName: \"kubernetes.io/projected/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-kube-api-access-rm7wn\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5nc4\" (UniqueName: \"kubernetes.io/projected/97e58d97-c4b1-4d4a-a6f6-d87a86138255-kube-api-access-c5nc4\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-run-netns\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-log-socket\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-cni-netd\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-lib-modules\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbhtf\" (UniqueName: \"kubernetes.io/projected/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-kube-api-access-mbhtf\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d5c4e0a-2236-4bab-82b8-acad605e74cb-agent-certs\") pod \"konnectivity-agent-724ct\" (UID: \"7d5c4e0a-2236-4bab-82b8-acad605e74cb\") " pod="kube-system/konnectivity-agent-724ct"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-cnibin\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-run-netns\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-daemon-config\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.877025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4d4a78de-0a92-44ec-af5e-719f1eca3757-iptables-alerter-script\") pod \"iptables-alerter-z7wvh\" (UID: \"4d4a78de-0a92-44ec-af5e-719f1eca3757\") " pod="openshift-network-operator/iptables-alerter-z7wvh"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-run-netns\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876838 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-log-socket\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d4a78de-0a92-44ec-af5e-719f1eca3757-host-slash\") pod \"iptables-alerter-z7wvh\" (UID: \"4d4a78de-0a92-44ec-af5e-719f1eca3757\") " pod="openshift-network-operator/iptables-alerter-z7wvh"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d4a78de-0a92-44ec-af5e-719f1eca3757-host-slash\") pod \"iptables-alerter-z7wvh\" (UID: \"4d4a78de-0a92-44ec-af5e-719f1eca3757\") " pod="openshift-network-operator/iptables-alerter-z7wvh"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9ce44109-8d3b-4499-b0fb-fa475f90e132-hosts-file\") pod \"node-resolver-bttgn\" (UID: \"9ce44109-8d3b-4499-b0fb-fa475f90e132\") " pod="openshift-dns/node-resolver-bttgn"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-cni-netd\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148d86dd-6cae-42a6-9e0e-44b0a13baa33-serviceca\") pod \"node-ca-b6xwm\" (UID: \"148d86dd-6cae-42a6-9e0e-44b0a13baa33\") " pod="openshift-image-registry/node-ca-b6xwm"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-run-openvswitch\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9ce44109-8d3b-4499-b0fb-fa475f90e132-hosts-file\") pod \"node-resolver-bttgn\" (UID: \"9ce44109-8d3b-4499-b0fb-fa475f90e132\") " pod="openshift-dns/node-resolver-bttgn"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.876996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65v9h\" (UniqueName: \"kubernetes.io/projected/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-kube-api-access-65v9h\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877064 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-kubernetes\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-systemd\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-var-lib-kubelet\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-tuned\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d5c4e0a-2236-4bab-82b8-acad605e74cb-konnectivity-ca\") pod \"konnectivity-agent-724ct\" (UID: \"7d5c4e0a-2236-4bab-82b8-acad605e74cb\") " pod="kube-system/konnectivity-agent-724ct"
Apr 20 19:25:46.877555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-run-ovn\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-sys-fs\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-etc-kubernetes\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-var-lib-openvswitch\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9ce44109-8d3b-4499-b0fb-fa475f90e132-tmp-dir\") pod \"node-resolver-bttgn\" (UID: \"9ce44109-8d3b-4499-b0fb-fa475f90e132\") " pod="openshift-dns/node-resolver-bttgn"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-node-log\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4d4a78de-0a92-44ec-af5e-719f1eca3757-iptables-alerter-script\") pod \"iptables-alerter-z7wvh\" (UID: \"4d4a78de-0a92-44ec-af5e-719f1eca3757\") " pod="openshift-network-operator/iptables-alerter-z7wvh"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877446 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-node-log\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-cni-bin\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-run-ovn\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-device-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-var-lib-openvswitch\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-host-cni-bin\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName:
\"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-cni-dir\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-run-openvswitch\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-var-lib-cni-multus\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v2cl\" (UniqueName: \"kubernetes.io/projected/234402da-caaa-48f3-8a69-400f12f55eb6-kube-api-access-6v2cl\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:25:46.878175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-modprobe-d\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877684 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/148d86dd-6cae-42a6-9e0e-44b0a13baa33-serviceca\") pod \"node-ca-b6xwm\" (UID: \"148d86dd-6cae-42a6-9e0e-44b0a13baa33\") " pod="openshift-image-registry/node-ca-b6xwm" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-etc-openvswitch\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877723 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/97e58d97-c4b1-4d4a-a6f6-d87a86138255-etc-openvswitch\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-conf-dir\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877786 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-cnibin\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877810 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-sysctl-d\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-sys\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/148d86dd-6cae-42a6-9e0e-44b0a13baa33-host\") pod \"node-ca-b6xwm\" (UID: \"148d86dd-6cae-42a6-9e0e-44b0a13baa33\") " pod="openshift-image-registry/node-ca-b6xwm" Apr 20 19:25:46.878829 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.877919 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/148d86dd-6cae-42a6-9e0e-44b0a13baa33-host\") pod \"node-ca-b6xwm\" (UID: \"148d86dd-6cae-42a6-9e0e-44b0a13baa33\") " pod="openshift-image-registry/node-ca-b6xwm" Apr 20 19:25:46.880192 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.880170 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/97e58d97-c4b1-4d4a-a6f6-d87a86138255-ovn-node-metrics-cert\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.907154 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.907120 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:46.907154 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.907155 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:46.907393 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.907170 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j5ml6 for pod openshift-network-diagnostics/network-check-target-64gnz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:46.907393 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.907263 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6 podName:77302d9f-23b6-4aa5-9de7-368ff66ca70e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:47.407230445 +0000 UTC m=+3.097643698 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-j5ml6" (UniqueName: "kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6") pod "network-check-target-64gnz" (UID: "77302d9f-23b6-4aa5-9de7-368ff66ca70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:46.910158 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.910124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntgvd\" (UniqueName: \"kubernetes.io/projected/148d86dd-6cae-42a6-9e0e-44b0a13baa33-kube-api-access-ntgvd\") pod \"node-ca-b6xwm\" (UID: \"148d86dd-6cae-42a6-9e0e-44b0a13baa33\") " pod="openshift-image-registry/node-ca-b6xwm" Apr 20 19:25:46.911959 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.911905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl977\" (UniqueName: \"kubernetes.io/projected/9ce44109-8d3b-4499-b0fb-fa475f90e132-kube-api-access-dl977\") pod \"node-resolver-bttgn\" (UID: \"9ce44109-8d3b-4499-b0fb-fa475f90e132\") " pod="openshift-dns/node-resolver-bttgn" Apr 20 19:25:46.911959 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.911929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5nc4\" (UniqueName: \"kubernetes.io/projected/97e58d97-c4b1-4d4a-a6f6-d87a86138255-kube-api-access-c5nc4\") pod \"ovnkube-node-w558t\" (UID: \"97e58d97-c4b1-4d4a-a6f6-d87a86138255\") " pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:46.931188 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.931155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq2rx\" (UniqueName: \"kubernetes.io/projected/4d4a78de-0a92-44ec-af5e-719f1eca3757-kube-api-access-hq2rx\") pod \"iptables-alerter-z7wvh\" (UID: \"4d4a78de-0a92-44ec-af5e-719f1eca3757\") " 
pod="openshift-network-operator/iptables-alerter-z7wvh" Apr 20 19:25:46.979249 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-device-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.979416 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-cni-dir\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.979416 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-var-lib-cni-multus\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.979416 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6v2cl\" (UniqueName: \"kubernetes.io/projected/234402da-caaa-48f3-8a69-400f12f55eb6-kube-api-access-6v2cl\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:25:46.979416 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-modprobe-d\") pod \"tuned-79jvp\" (UID: 
\"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.979416 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-var-lib-cni-multus\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.979416 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979414 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-conf-dir\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-device-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979437 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-cnibin\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-cnibin\") pod \"multus-additional-cni-plugins-wbqgg\" 
(UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-conf-dir\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-sysctl-d\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-sys\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-socket-dir-parent\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wbqgg\" 
(UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-modprobe-d\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.979727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-sysctl-d\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.980127 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-cni-dir\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980127 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-socket-dir-parent\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980127 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.979841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-sys\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.980223 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-run\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.980223 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-run\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.980223 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-cni-binary-copy\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.980323 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvw8s\" (UniqueName: 
\"kubernetes.io/projected/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-kube-api-access-dvw8s\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.980323 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.980323 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-cni-binary-copy\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980440 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-var-lib-cni-bin\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980440 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-hostroot\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980440 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-system-cni-dir\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.980440 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.980440 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-hostroot\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980440 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-os-release\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980440 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-var-lib-cni-bin\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980469 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-host\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-socket-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-os-release\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-system-cni-dir\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-host\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980522 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-var-lib-kubelet\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-sysconfig\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-socket-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-tmp\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:25:46.980662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-sysconfig\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-os-release\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-var-lib-kubelet\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-etc-selinux\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980727 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-os-release\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980735 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-cni-binary-copy\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.980736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-system-cni-dir\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980756 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-etc-selinux\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-run-multus-certs\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:25:46.980780 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-system-cni-dir\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-run-multus-certs\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-registration-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-registration-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.981371 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-sysctl-conf\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-cni-binary-copy\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-run-k8s-cni-cncf-io\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.980933 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rm7wn\" (UniqueName: \"kubernetes.io/projected/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-kube-api-access-rm7wn\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-sysctl-conf\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.980998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-run-k8s-cni-cncf-io\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:46.980998 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs podName:234402da-caaa-48f3-8a69-400f12f55eb6 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:47.480979688 +0000 UTC m=+3.171392957 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs") pod "network-metrics-daemon-j9xp6" (UID: "234402da-caaa-48f3-8a69-400f12f55eb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:46.981371 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-lib-modules\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbhtf\" (UniqueName: \"kubernetes.io/projected/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-kube-api-access-mbhtf\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d5c4e0a-2236-4bab-82b8-acad605e74cb-agent-certs\") pod \"konnectivity-agent-724ct\" (UID: \"7d5c4e0a-2236-4bab-82b8-acad605e74cb\") " pod="kube-system/konnectivity-agent-724ct" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-cnibin\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981159 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-run-netns\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-lib-modules\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-daemon-config\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981236 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-host-run-netns\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-cnibin\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981303 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65v9h\" (UniqueName: \"kubernetes.io/projected/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-kube-api-access-65v9h\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-kubernetes\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-systemd\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-var-lib-kubelet\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-kubernetes\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-tuned\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-systemd\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981428 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d5c4e0a-2236-4bab-82b8-acad605e74cb-konnectivity-ca\") pod \"konnectivity-agent-724ct\" (UID: \"7d5c4e0a-2236-4bab-82b8-acad605e74cb\") " pod="kube-system/konnectivity-agent-724ct" Apr 20 19:25:46.982849 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" 
(UniqueName: \"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-sys-fs\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.982849 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-var-lib-kubelet\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.982849 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-etc-kubernetes\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.982849 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.981510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-etc-kubernetes\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.982849 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.982743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-multus-daemon-config\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:46.983487 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.983465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-sys-fs\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:46.984037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.984012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d5c4e0a-2236-4bab-82b8-acad605e74cb-konnectivity-ca\") pod \"konnectivity-agent-724ct\" (UID: \"7d5c4e0a-2236-4bab-82b8-acad605e74cb\") " pod="kube-system/konnectivity-agent-724ct" Apr 20 19:25:46.986338 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.984814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-tmp\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.986338 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.984918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-etc-tuned\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:46.986338 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:46.985383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d5c4e0a-2236-4bab-82b8-acad605e74cb-agent-certs\") pod \"konnectivity-agent-724ct\" (UID: \"7d5c4e0a-2236-4bab-82b8-acad605e74cb\") " pod="kube-system/konnectivity-agent-724ct" Apr 20 19:25:47.003444 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.003368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v2cl\" (UniqueName: 
\"kubernetes.io/projected/234402da-caaa-48f3-8a69-400f12f55eb6-kube-api-access-6v2cl\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:25:47.010165 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.010124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65v9h\" (UniqueName: \"kubernetes.io/projected/ddf66c79-6b7a-4d5c-93f0-e2b401bede8d-kube-api-access-65v9h\") pod \"multus-545qf\" (UID: \"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d\") " pod="openshift-multus/multus-545qf" Apr 20 19:25:47.010309 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.010267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm7wn\" (UniqueName: \"kubernetes.io/projected/3e3c3be7-736a-4d5e-ae42-c0f7e318af44-kube-api-access-rm7wn\") pod \"multus-additional-cni-plugins-wbqgg\" (UID: \"3e3c3be7-736a-4d5e-ae42-c0f7e318af44\") " pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:47.010387 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.010370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbhtf\" (UniqueName: \"kubernetes.io/projected/e43dc51c-4a26-4d80-a786-f5d55c0ce49d-kube-api-access-mbhtf\") pod \"tuned-79jvp\" (UID: \"e43dc51c-4a26-4d80-a786-f5d55c0ce49d\") " pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:47.012195 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.012168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvw8s\" (UniqueName: \"kubernetes.io/projected/127b4f22-0e3b-457f-b9a3-5ae595e69d8b-kube-api-access-dvw8s\") pod \"aws-ebs-csi-driver-node-nfl54\" (UID: \"127b4f22-0e3b-457f-b9a3-5ae595e69d8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:47.020152 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.020103 2575 
certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:20:46 +0000 UTC" deadline="2027-12-05 23:56:37.98427287 +0000 UTC" Apr 20 19:25:47.020152 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.020148 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14260h30m50.964127971s" Apr 20 19:25:47.066046 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.066011 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bttgn" Apr 20 19:25:47.072191 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.072164 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:25:47.081839 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.081806 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z7wvh" Apr 20 19:25:47.089594 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.089562 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b6xwm" Apr 20 19:25:47.097205 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.097177 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-545qf" Apr 20 19:25:47.101932 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.101904 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" Apr 20 19:25:47.109352 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.109322 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-724ct" Apr 20 19:25:47.116334 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.116300 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" Apr 20 19:25:47.122165 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.122136 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-79jvp" Apr 20 19:25:47.182312 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.182280 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:25:47.484215 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.484123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:25:47.484215 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.484188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:25:47.484424 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:47.484309 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:47.484424 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:47.484342 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:47.484424 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:47.484360 2575 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:47.484424 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:47.484370 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j5ml6 for pod openshift-network-diagnostics/network-check-target-64gnz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:47.484424 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:47.484393 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs podName:234402da-caaa-48f3-8a69-400f12f55eb6 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:48.484369618 +0000 UTC m=+4.174782876 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs") pod "network-metrics-daemon-j9xp6" (UID: "234402da-caaa-48f3-8a69-400f12f55eb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:47.484642 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:47.484439 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6 podName:77302d9f-23b6-4aa5-9de7-368ff66ca70e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:48.484401003 +0000 UTC m=+4.174814265 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-j5ml6" (UniqueName: "kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6") pod "network-check-target-64gnz" (UID: "77302d9f-23b6-4aa5-9de7-368ff66ca70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:47.555853 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:47.555782 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e58d97_c4b1_4d4a_a6f6_d87a86138255.slice/crio-a877cbda3b3f02fac0211e091a700e31e9f0632348c75756aabf3104681f067e WatchSource:0}: Error finding container a877cbda3b3f02fac0211e091a700e31e9f0632348c75756aabf3104681f067e: Status 404 returned error can't find the container with id a877cbda3b3f02fac0211e091a700e31e9f0632348c75756aabf3104681f067e Apr 20 19:25:47.558924 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:47.558887 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddf66c79_6b7a_4d5c_93f0_e2b401bede8d.slice/crio-6db0054ca57fbe1989d2d364d25db11a7b521015cd35c730617fa60440d8c6f3 WatchSource:0}: Error finding container 6db0054ca57fbe1989d2d364d25db11a7b521015cd35c730617fa60440d8c6f3: Status 404 returned error can't find the container with id 6db0054ca57fbe1989d2d364d25db11a7b521015cd35c730617fa60440d8c6f3 Apr 20 19:25:47.560897 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:47.560864 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d5c4e0a_2236_4bab_82b8_acad605e74cb.slice/crio-c2bd0cfc6fff6d36142a60c0bfbb4cfed155be2d74ae282cd139ee2b81488e40 WatchSource:0}: Error finding container c2bd0cfc6fff6d36142a60c0bfbb4cfed155be2d74ae282cd139ee2b81488e40: Status 404 returned error can't find the 
container with id c2bd0cfc6fff6d36142a60c0bfbb4cfed155be2d74ae282cd139ee2b81488e40 Apr 20 19:25:47.562331 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:47.562196 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce44109_8d3b_4499_b0fb_fa475f90e132.slice/crio-881616b0ddca3e8f17f758f11d23f9d176816e7ee4bb440c9c6a9a7e11128e58 WatchSource:0}: Error finding container 881616b0ddca3e8f17f758f11d23f9d176816e7ee4bb440c9c6a9a7e11128e58: Status 404 returned error can't find the container with id 881616b0ddca3e8f17f758f11d23f9d176816e7ee4bb440c9c6a9a7e11128e58 Apr 20 19:25:47.584408 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:47.584382 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127b4f22_0e3b_457f_b9a3_5ae595e69d8b.slice/crio-ae5f0e8464395313e486c8b82f055993a865fd46d269db184dccc20ce847fb82 WatchSource:0}: Error finding container ae5f0e8464395313e486c8b82f055993a865fd46d269db184dccc20ce847fb82: Status 404 returned error can't find the container with id ae5f0e8464395313e486c8b82f055993a865fd46d269db184dccc20ce847fb82 Apr 20 19:25:47.586522 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:47.586025 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod148d86dd_6cae_42a6_9e0e_44b0a13baa33.slice/crio-c36b52a61f82f6c1c804880d6d01664675ea9624b22b65838e4ed352f1a68704 WatchSource:0}: Error finding container c36b52a61f82f6c1c804880d6d01664675ea9624b22b65838e4ed352f1a68704: Status 404 returned error can't find the container with id c36b52a61f82f6c1c804880d6d01664675ea9624b22b65838e4ed352f1a68704 Apr 20 19:25:47.586633 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:47.586582 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d4a78de_0a92_44ec_af5e_719f1eca3757.slice/crio-6c3f47d18e1d0344254defb6ee8490d671d8075b530dec980243e1798b91f0e3 WatchSource:0}: Error finding container 6c3f47d18e1d0344254defb6ee8490d671d8075b530dec980243e1798b91f0e3: Status 404 returned error can't find the container with id 6c3f47d18e1d0344254defb6ee8490d671d8075b530dec980243e1798b91f0e3 Apr 20 19:25:47.588162 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:47.588126 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e3c3be7_736a_4d5e_ae42_c0f7e318af44.slice/crio-71aa71d81cb1648ff2da4152f3c8513befb0aa6eafef94efd7c43bd73e5d4aed WatchSource:0}: Error finding container 71aa71d81cb1648ff2da4152f3c8513befb0aa6eafef94efd7c43bd73e5d4aed: Status 404 returned error can't find the container with id 71aa71d81cb1648ff2da4152f3c8513befb0aa6eafef94efd7c43bd73e5d4aed Apr 20 19:25:47.589581 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:25:47.589448 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode43dc51c_4a26_4d80_a786_f5d55c0ce49d.slice/crio-742daf9c19dbedc2b34be6a960af5e4ec34ead480809bb9ed3d4ad9a83975820 WatchSource:0}: Error finding container 742daf9c19dbedc2b34be6a960af5e4ec34ead480809bb9ed3d4ad9a83975820: Status 404 returned error can't find the container with id 742daf9c19dbedc2b34be6a960af5e4ec34ead480809bb9ed3d4ad9a83975820 Apr 20 19:25:47.912328 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.912240 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-724ct" event={"ID":"7d5c4e0a-2236-4bab-82b8-acad605e74cb","Type":"ContainerStarted","Data":"c2bd0cfc6fff6d36142a60c0bfbb4cfed155be2d74ae282cd139ee2b81488e40"} Apr 20 19:25:47.915560 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.915524 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-545qf" event={"ID":"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d","Type":"ContainerStarted","Data":"6db0054ca57fbe1989d2d364d25db11a7b521015cd35c730617fa60440d8c6f3"}
Apr 20 19:25:47.918985 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.918949 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerStarted","Data":"a877cbda3b3f02fac0211e091a700e31e9f0632348c75756aabf3104681f067e"}
Apr 20 19:25:47.921216 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.921183 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal" event={"ID":"60834707d5f53c027199cd5bc82fb1f6","Type":"ContainerStarted","Data":"2d6d8e41dc2671c8377f2de23f22ed13d490cae957bc4248219ffc0ab8bf9c15"}
Apr 20 19:25:47.922324 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.922297 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" event={"ID":"127b4f22-0e3b-457f-b9a3-5ae595e69d8b","Type":"ContainerStarted","Data":"ae5f0e8464395313e486c8b82f055993a865fd46d269db184dccc20ce847fb82"}
Apr 20 19:25:47.923461 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.923432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bttgn" event={"ID":"9ce44109-8d3b-4499-b0fb-fa475f90e132","Type":"ContainerStarted","Data":"881616b0ddca3e8f17f758f11d23f9d176816e7ee4bb440c9c6a9a7e11128e58"}
Apr 20 19:25:47.924865 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.924839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-79jvp" event={"ID":"e43dc51c-4a26-4d80-a786-f5d55c0ce49d","Type":"ContainerStarted","Data":"742daf9c19dbedc2b34be6a960af5e4ec34ead480809bb9ed3d4ad9a83975820"}
Apr 20 19:25:47.928254 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.928222 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" event={"ID":"3e3c3be7-736a-4d5e-ae42-c0f7e318af44","Type":"ContainerStarted","Data":"71aa71d81cb1648ff2da4152f3c8513befb0aa6eafef94efd7c43bd73e5d4aed"}
Apr 20 19:25:47.930392 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.930360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z7wvh" event={"ID":"4d4a78de-0a92-44ec-af5e-719f1eca3757","Type":"ContainerStarted","Data":"6c3f47d18e1d0344254defb6ee8490d671d8075b530dec980243e1798b91f0e3"}
Apr 20 19:25:47.931683 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:47.931653 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b6xwm" event={"ID":"148d86dd-6cae-42a6-9e0e-44b0a13baa33","Type":"ContainerStarted","Data":"c36b52a61f82f6c1c804880d6d01664675ea9624b22b65838e4ed352f1a68704"}
Apr 20 19:25:48.020963 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:48.020916 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:20:46 +0000 UTC" deadline="2027-10-26 08:08:50.344217118 +0000 UTC"
Apr 20 19:25:48.020963 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:48.020959 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13284h43m2.323261237s"
Apr 20 19:25:48.491096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:48.491054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:25:48.491320 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:48.491144 2575 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:48.491320 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:48.491236 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:25:48.491320 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:48.491262 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:25:48.491320 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:48.491270 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:48.491320 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:48.491276 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j5ml6 for pod openshift-network-diagnostics/network-check-target-64gnz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:48.491651 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:48.491340 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6 podName:77302d9f-23b6-4aa5-9de7-368ff66ca70e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:50.491320381 +0000 UTC m=+6.181733697 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-j5ml6" (UniqueName: "kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6") pod "network-check-target-64gnz" (UID: "77302d9f-23b6-4aa5-9de7-368ff66ca70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:48.491651 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:48.491362 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs podName:234402da-caaa-48f3-8a69-400f12f55eb6 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:50.491350931 +0000 UTC m=+6.181764189 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs") pod "network-metrics-daemon-j9xp6" (UID: "234402da-caaa-48f3-8a69-400f12f55eb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:48.903839 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:48.903803 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:25:48.904038 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:48.903944 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e"
Apr 20 19:25:48.904410 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:48.904388 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:48.904522 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:48.904505 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:25:48.947744 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:48.947694 2575 generic.go:358] "Generic (PLEG): container finished" podID="2475b11f2cca7a40145b7c82467d78f7" containerID="6df74b621de25085ade0853b3e380104d1a8712d9cc01ecbc87e36d715d70818" exitCode=0
Apr 20 19:25:48.948218 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:48.947779 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" event={"ID":"2475b11f2cca7a40145b7c82467d78f7","Type":"ContainerDied","Data":"6df74b621de25085ade0853b3e380104d1a8712d9cc01ecbc87e36d715d70818"}
Apr 20 19:25:48.968681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:48.967772 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-55.ec2.internal" podStartSLOduration=2.967753136 podStartE2EDuration="2.967753136s" podCreationTimestamp="2026-04-20 19:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:25:47.939071886 +0000 UTC m=+3.629485162" watchObservedRunningTime="2026-04-20 19:25:48.967753136 +0000 UTC m=+4.658166410"
Apr 20 19:25:49.963165 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:49.962932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" event={"ID":"2475b11f2cca7a40145b7c82467d78f7","Type":"ContainerStarted","Data":"15378ca2c1a544f0b3d6f60e791d0efa2cf8718f4843d6d30042ad93f60c12b6"}
Apr 20 19:25:50.509222 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:50.508395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:25:50.509222 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:50.508475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:50.509222 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:50.508631 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:50.509222 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:50.508697 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs podName:234402da-caaa-48f3-8a69-400f12f55eb6 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:54.508677068 +0000 UTC m=+10.199090326 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs") pod "network-metrics-daemon-j9xp6" (UID: "234402da-caaa-48f3-8a69-400f12f55eb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:50.509222 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:50.509105 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:25:50.509222 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:50.509123 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:25:50.509222 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:50.509136 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j5ml6 for pod openshift-network-diagnostics/network-check-target-64gnz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:50.509222 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:50.509182 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6 podName:77302d9f-23b6-4aa5-9de7-368ff66ca70e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:54.509165963 +0000 UTC m=+10.199579218 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-j5ml6" (UniqueName: "kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6") pod "network-check-target-64gnz" (UID: "77302d9f-23b6-4aa5-9de7-368ff66ca70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:50.904696 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:50.903687 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:50.904696 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:50.903737 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:25:50.904696 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:50.903848 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:25:50.904696 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:50.903993 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e"
Apr 20 19:25:52.903826 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:52.903793 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:25:52.904265 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:52.903793 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:52.904265 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:52.903918 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e"
Apr 20 19:25:52.904265 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:52.904018 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:25:54.547366 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:54.547334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:25:54.547941 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:54.547398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:54.547941 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:54.547519 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:54.547941 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:54.547581 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs podName:234402da-caaa-48f3-8a69-400f12f55eb6 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:02.547562661 +0000 UTC m=+18.237975925 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs") pod "network-metrics-daemon-j9xp6" (UID: "234402da-caaa-48f3-8a69-400f12f55eb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:54.548110 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:54.548029 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:25:54.548110 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:54.548048 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:25:54.548110 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:54.548060 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j5ml6 for pod openshift-network-diagnostics/network-check-target-64gnz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:54.548247 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:54.548108 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6 podName:77302d9f-23b6-4aa5-9de7-368ff66ca70e nodeName:}" failed. No retries permitted until 2026-04-20 19:26:02.548091537 +0000 UTC m=+18.238504793 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-j5ml6" (UniqueName: "kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6") pod "network-check-target-64gnz" (UID: "77302d9f-23b6-4aa5-9de7-368ff66ca70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:54.906082 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:54.904829 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:25:54.906082 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:54.904949 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e"
Apr 20 19:25:54.906082 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:54.904999 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:54.906082 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:54.905148 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:25:56.903206 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:56.903162 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:25:56.903777 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:56.903293 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e"
Apr 20 19:25:56.903777 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:56.903387 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:56.903777 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:56.903522 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:25:58.903831 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:58.903793 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:25:58.904235 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:58.903945 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:25:58.904235 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:25:58.903991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:25:58.904235 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:25:58.904059 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e"
Apr 20 19:26:00.903206 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:00.903111 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:00.903819 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:00.903243 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e"
Apr 20 19:26:00.903819 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:00.903304 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:26:00.903819 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:00.903425 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:26:02.605774 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:02.605728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:26:02.606242 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:02.605813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:02.606242 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:02.605869 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:26:02.606242 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:02.605939 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:26:02.606242 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:02.605960 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:26:02.606242 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:02.605973 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j5ml6 for pod openshift-network-diagnostics/network-check-target-64gnz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:26:02.606242 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:02.605945 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs podName:234402da-caaa-48f3-8a69-400f12f55eb6 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:18.605926145 +0000 UTC m=+34.296339418 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs") pod "network-metrics-daemon-j9xp6" (UID: "234402da-caaa-48f3-8a69-400f12f55eb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:26:02.606242 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:02.606045 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6 podName:77302d9f-23b6-4aa5-9de7-368ff66ca70e nodeName:}" failed. No retries permitted until 2026-04-20 19:26:18.606028285 +0000 UTC m=+34.296441546 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "kube-api-access-j5ml6" (UniqueName: "kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6") pod "network-check-target-64gnz" (UID: "77302d9f-23b6-4aa5-9de7-368ff66ca70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:26:02.903751 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:02.903192 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:26:02.903751 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:02.903238 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:02.903751 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:02.903345 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:26:02.903751 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:02.903468 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e"
Apr 20 19:26:04.908145 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:04.907926 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:04.908801 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:04.908224 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e"
Apr 20 19:26:04.908801 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:04.908281 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:26:04.908801 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:04.908446 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:26:05.996768 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:05.996314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" event={"ID":"127b4f22-0e3b-457f-b9a3-5ae595e69d8b","Type":"ContainerStarted","Data":"1d729e5ba8d54fd56755bce68b3cabedbc68a85e14ce38470024ccda8600422e"}
Apr 20 19:26:05.997751 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:05.997715 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bttgn" event={"ID":"9ce44109-8d3b-4499-b0fb-fa475f90e132","Type":"ContainerStarted","Data":"eda78961023860e2d8ad3cfc9a4d5215069aa59e1eb8ae012e533a9f0500b64b"}
Apr 20 19:26:05.999086 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:05.999044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-79jvp" event={"ID":"e43dc51c-4a26-4d80-a786-f5d55c0ce49d","Type":"ContainerStarted","Data":"d6a251c8bd902d2d5b81d6f9593dc8f77b0de95a63981f6359904d81c3dc2fe3"}
Apr 20 19:26:06.000417 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.000394 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e3c3be7-736a-4d5e-ae42-c0f7e318af44" containerID="b2367855c366379cf962a1fb24297b1e016522a22fbe8fd6706973762206c666" exitCode=0
Apr 20 19:26:06.000530 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.000461 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" event={"ID":"3e3c3be7-736a-4d5e-ae42-c0f7e318af44","Type":"ContainerDied","Data":"b2367855c366379cf962a1fb24297b1e016522a22fbe8fd6706973762206c666"}
Apr 20 19:26:06.001953 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.001721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b6xwm" event={"ID":"148d86dd-6cae-42a6-9e0e-44b0a13baa33","Type":"ContainerStarted","Data":"3516a423f2e78ba89119621c313e3ea1d6b2735ea048d2cd64a30eafd54761b7"}
Apr 20 19:26:06.004030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.003643 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-724ct" event={"ID":"7d5c4e0a-2236-4bab-82b8-acad605e74cb","Type":"ContainerStarted","Data":"31dc6d5368ff1da84004eefcd9aa05362f1e71173facbbb2a1ff6b65dbe06faf"}
Apr 20 19:26:06.007900 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.007865 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-545qf" event={"ID":"ddf66c79-6b7a-4d5c-93f0-e2b401bede8d","Type":"ContainerStarted","Data":"7759269b55b17c3ea16bf66a697683c09a1a99079921d7f029930ef66456f3ac"}
Apr 20 19:26:06.010827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.010802 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w558t_97e58d97-c4b1-4d4a-a6f6-d87a86138255/ovn-acl-logging/0.log"
Apr 20 19:26:06.011125 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.011100 2575 generic.go:358] "Generic (PLEG): container finished" podID="97e58d97-c4b1-4d4a-a6f6-d87a86138255" containerID="adc635b0a7c44fe0ae61a1ebf5dcc19f9af7f92d1883ffc99578131ab2839307" exitCode=1
Apr 20 19:26:06.011219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.011145 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerStarted","Data":"d22d36b32f8e7ae04c360e40ae011980065db06e1e71c294c646af30fa1b5791"}
Apr 20 19:26:06.011219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.011165 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerStarted","Data":"1871c51e506fada984232b7cf603ddb90ef9aee28d7a50d734241247d148db50"}
Apr 20 19:26:06.011219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.011173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerStarted","Data":"704b7fbecf2da08ff2c7efb8e4247bd9338c0e56d702aa803b542326115ada18"}
Apr 20 19:26:06.011219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.011182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerStarted","Data":"7ba2df0052bd51f721814bd5125bee3e7916a6860d7fd15a0dcada8fb14ccf9a"}
Apr 20 19:26:06.011219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.011189 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerDied","Data":"adc635b0a7c44fe0ae61a1ebf5dcc19f9af7f92d1883ffc99578131ab2839307"}
Apr 20 19:26:06.011219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.011198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerStarted","Data":"dbd19a651cd4feae19cea0d16a3ec9f4612ab5c32eb43e5abb09e4b8da8b2828"}
Apr 20 19:26:06.022930 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.022638 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-55.ec2.internal" podStartSLOduration=20.022594282 podStartE2EDuration="20.022594282s" podCreationTimestamp="2026-04-20 19:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC"
observedRunningTime="2026-04-20 19:25:49.993082448 +0000 UTC m=+5.683495725" watchObservedRunningTime="2026-04-20 19:26:06.022594282 +0000 UTC m=+21.713007556" Apr 20 19:26:06.022930 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.022877 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bttgn" podStartSLOduration=4.707445107 podStartE2EDuration="22.022870778s" podCreationTimestamp="2026-04-20 19:25:44 +0000 UTC" firstStartedPulling="2026-04-20 19:25:47.582998783 +0000 UTC m=+3.273412035" lastFinishedPulling="2026-04-20 19:26:04.89842444 +0000 UTC m=+20.588837706" observedRunningTime="2026-04-20 19:26:06.022827278 +0000 UTC m=+21.713240552" watchObservedRunningTime="2026-04-20 19:26:06.022870778 +0000 UTC m=+21.713284053" Apr 20 19:26:06.050184 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.050131 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b6xwm" podStartSLOduration=3.7400250699999997 podStartE2EDuration="21.050112052s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:25:47.588248835 +0000 UTC m=+3.278662098" lastFinishedPulling="2026-04-20 19:26:04.898335819 +0000 UTC m=+20.588749080" observedRunningTime="2026-04-20 19:26:06.049816136 +0000 UTC m=+21.740229411" watchObservedRunningTime="2026-04-20 19:26:06.050112052 +0000 UTC m=+21.740525326" Apr 20 19:26:06.081061 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.080963 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-545qf" podStartSLOduration=3.696772511 podStartE2EDuration="21.08095039s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:25:47.560847557 +0000 UTC m=+3.251260810" lastFinishedPulling="2026-04-20 19:26:04.94502542 +0000 UTC m=+20.635438689" observedRunningTime="2026-04-20 19:26:06.08064831 +0000 UTC m=+21.771061585" 
watchObservedRunningTime="2026-04-20 19:26:06.08095039 +0000 UTC m=+21.771363667" Apr 20 19:26:06.119735 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.119673 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-724ct" podStartSLOduration=8.258945921 podStartE2EDuration="21.119657069s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:25:47.582997236 +0000 UTC m=+3.273410488" lastFinishedPulling="2026-04-20 19:26:00.443708378 +0000 UTC m=+16.134121636" observedRunningTime="2026-04-20 19:26:06.119194314 +0000 UTC m=+21.809607590" watchObservedRunningTime="2026-04-20 19:26:06.119657069 +0000 UTC m=+21.810070342" Apr 20 19:26:06.191274 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.191205 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-79jvp" podStartSLOduration=3.8858084980000003 podStartE2EDuration="21.191182797s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:25:47.592915275 +0000 UTC m=+3.283328530" lastFinishedPulling="2026-04-20 19:26:04.898289574 +0000 UTC m=+20.588702829" observedRunningTime="2026-04-20 19:26:06.190325273 +0000 UTC m=+21.880738559" watchObservedRunningTime="2026-04-20 19:26:06.191182797 +0000 UTC m=+21.881596075" Apr 20 19:26:06.762632 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.762580 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 19:26:06.848315 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.848148 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T19:26:06.76260023Z","UUID":"9bcf6928-c49f-4f7f-bff5-419c7a9df00f","Handler":null,"Name":"","Endpoint":""} Apr 20 19:26:06.849921 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.849895 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 19:26:06.849921 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.849928 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 19:26:06.903240 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.903203 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:26:06.903387 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:06.903203 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:26:06.903451 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:06.903402 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6" Apr 20 19:26:06.903451 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:06.903305 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e" Apr 20 19:26:07.014545 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:07.014503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z7wvh" event={"ID":"4d4a78de-0a92-44ec-af5e-719f1eca3757","Type":"ContainerStarted","Data":"d0f7f700b94f0bc996f557947361900c7d57d747c3922f466d4a513c840a266b"} Apr 20 19:26:07.016395 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:07.016367 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" event={"ID":"127b4f22-0e3b-457f-b9a3-5ae595e69d8b","Type":"ContainerStarted","Data":"4c85d56e31e73af467137a8c189b95e73c8e94b86bccfeafd14a73e1952823d5"} Apr 20 19:26:07.041454 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:07.041394 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-z7wvh" podStartSLOduration=4.731827795 podStartE2EDuration="22.041372885s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:25:47.588799411 +0000 UTC m=+3.279212678" lastFinishedPulling="2026-04-20 19:26:04.898344516 +0000 UTC m=+20.588757768" observedRunningTime="2026-04-20 19:26:07.040926246 +0000 UTC m=+22.731339520" watchObservedRunningTime="2026-04-20 19:26:07.041372885 +0000 UTC m=+22.731786160" Apr 20 19:26:08.022437 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:08.022410 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w558t_97e58d97-c4b1-4d4a-a6f6-d87a86138255/ovn-acl-logging/0.log" Apr 20 19:26:08.023423 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:08.022902 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" 
event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerStarted","Data":"47a862912fc2510ff5f5075f2a084e4945111c873936008ea05330f44cff30ba"} Apr 20 19:26:08.902943 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:08.902906 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:26:08.903124 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:08.903044 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e" Apr 20 19:26:08.903124 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:08.903106 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:26:08.903229 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:08.903204 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6" Apr 20 19:26:09.027691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:09.027650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" event={"ID":"127b4f22-0e3b-457f-b9a3-5ae595e69d8b","Type":"ContainerStarted","Data":"3b7a9a61a7faddbc02a9de746b37d09270314ef3bfe572347f21c3d740212a79"} Apr 20 19:26:09.063687 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:09.063628 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nfl54" podStartSLOduration=3.363075968 podStartE2EDuration="24.063593126s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:25:47.586450971 +0000 UTC m=+3.276864223" lastFinishedPulling="2026-04-20 19:26:08.286968125 +0000 UTC m=+23.977381381" observedRunningTime="2026-04-20 19:26:09.063384319 +0000 UTC m=+24.753797592" watchObservedRunningTime="2026-04-20 19:26:09.063593126 +0000 UTC m=+24.754006400" Apr 20 19:26:09.627349 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:09.627313 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-724ct" Apr 20 19:26:09.628129 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:09.628108 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-724ct" Apr 20 19:26:10.030482 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:10.030451 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-724ct" Apr 20 19:26:10.031290 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:10.031274 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-724ct" Apr 20 19:26:10.904079 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:10.903880 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:26:10.904244 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:10.903894 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:26:10.904244 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:10.904180 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e" Apr 20 19:26:10.904335 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:10.904305 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6" Apr 20 19:26:11.033191 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:11.033153 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e3c3be7-736a-4d5e-ae42-c0f7e318af44" containerID="a505132a0515e85bce572e315a8e073d36e7b8e380d1a4552c72f27fc42d7c98" exitCode=0 Apr 20 19:26:11.033651 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:11.033243 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" event={"ID":"3e3c3be7-736a-4d5e-ae42-c0f7e318af44","Type":"ContainerDied","Data":"a505132a0515e85bce572e315a8e073d36e7b8e380d1a4552c72f27fc42d7c98"} Apr 20 19:26:11.038442 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:11.038422 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w558t_97e58d97-c4b1-4d4a-a6f6-d87a86138255/ovn-acl-logging/0.log" Apr 20 19:26:11.038866 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:11.038839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerStarted","Data":"925eafcb3ec6a949d19eb911b72746db890b73de0bf7ca46921fbff3c9feacfa"} Apr 20 19:26:11.039151 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:11.039128 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:26:11.039239 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:11.039160 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:26:11.039336 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:11.039318 2575 scope.go:117] "RemoveContainer" containerID="adc635b0a7c44fe0ae61a1ebf5dcc19f9af7f92d1883ffc99578131ab2839307" Apr 20 19:26:11.056536 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:11.056510 
2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:26:12.043445 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.043253 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e3c3be7-736a-4d5e-ae42-c0f7e318af44" containerID="c884f727aab0cafe282a60acc1e2cb4841d785d09ebff78ce9aaaeb1d331df75" exitCode=0 Apr 20 19:26:12.044037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.043325 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" event={"ID":"3e3c3be7-736a-4d5e-ae42-c0f7e318af44","Type":"ContainerDied","Data":"c884f727aab0cafe282a60acc1e2cb4841d785d09ebff78ce9aaaeb1d331df75"} Apr 20 19:26:12.047039 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.047015 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w558t_97e58d97-c4b1-4d4a-a6f6-d87a86138255/ovn-acl-logging/0.log" Apr 20 19:26:12.047396 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.047366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" event={"ID":"97e58d97-c4b1-4d4a-a6f6-d87a86138255","Type":"ContainerStarted","Data":"010396602bd9d9453d6a768b524c172fbd61ca0d335d47d318169105d0949381"} Apr 20 19:26:12.047755 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.047712 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:26:12.063591 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.063560 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" Apr 20 19:26:12.141306 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.141248 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w558t" podStartSLOduration=10.672325047 
podStartE2EDuration="28.141230712s" podCreationTimestamp="2026-04-20 19:25:44 +0000 UTC" firstStartedPulling="2026-04-20 19:25:47.557773488 +0000 UTC m=+3.248186746" lastFinishedPulling="2026-04-20 19:26:05.026679145 +0000 UTC m=+20.717092411" observedRunningTime="2026-04-20 19:26:12.137453726 +0000 UTC m=+27.827867026" watchObservedRunningTime="2026-04-20 19:26:12.141230712 +0000 UTC m=+27.831644013" Apr 20 19:26:12.360978 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.360801 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-64gnz"] Apr 20 19:26:12.360978 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.360947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:26:12.361150 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:12.361043 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e" Apr 20 19:26:12.372217 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.372178 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j9xp6"] Apr 20 19:26:12.372403 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:12.372337 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:26:12.372468 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:12.372450 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6" Apr 20 19:26:13.051543 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:13.051511 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e3c3be7-736a-4d5e-ae42-c0f7e318af44" containerID="d0fbb4479cc2859dceb8e7f9e65d1c4b1518d72f92fa22388e6a18e9d4472139" exitCode=0 Apr 20 19:26:13.052165 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:13.051655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" event={"ID":"3e3c3be7-736a-4d5e-ae42-c0f7e318af44","Type":"ContainerDied","Data":"d0fbb4479cc2859dceb8e7f9e65d1c4b1518d72f92fa22388e6a18e9d4472139"} Apr 20 19:26:13.903555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:13.903520 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:26:13.903761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:13.903518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:26:13.903761 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:13.903674 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6" Apr 20 19:26:13.903761 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:13.903710 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e" Apr 20 19:26:15.903050 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:15.903012 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:26:15.903724 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:15.903016 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:26:15.903724 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:15.903154 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e" Apr 20 19:26:15.903724 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:15.903248 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6" Apr 20 19:26:17.903271 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:17.903057 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:26:17.903734 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:17.903078 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz" Apr 20 19:26:17.903734 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:17.903368 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6" Apr 20 19:26:17.903734 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:17.903435 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-64gnz" podUID="77302d9f-23b6-4aa5-9de7-368ff66ca70e" Apr 20 19:26:18.129155 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.129042 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-55.ec2.internal" event="NodeReady" Apr 20 19:26:18.129382 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.129201 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 19:26:18.202550 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.202302 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-d6zt6"] Apr 20 19:26:18.230856 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.230817 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hsc5z"] Apr 20 19:26:18.231025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.230941 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6zt6" Apr 20 19:26:18.234175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.234151 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 19:26:18.234175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.234161 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 19:26:18.234489 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.234471 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5vnpt\"" Apr 20 19:26:18.242314 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.242290 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6zt6"] Apr 20 19:26:18.242314 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.242318 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hsc5z"] 
Apr 20 19:26:18.242494 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.242422 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:18.247847 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.247818 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 19:26:18.248008 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.247819 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 19:26:18.248008 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.247937 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbf4z\""
Apr 20 19:26:18.248115 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.247819 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 19:26:18.338102 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.338063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmx2r\" (UniqueName: \"kubernetes.io/projected/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-kube-api-access-gmx2r\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:18.338338 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.338113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b292e05-658a-4efc-8b35-8f64c0071f73-tmp-dir\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.338338 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.338186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.338338 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.338228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrzx6\" (UniqueName: \"kubernetes.io/projected/5b292e05-658a-4efc-8b35-8f64c0071f73-kube-api-access-vrzx6\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.338338 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.338318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b292e05-658a-4efc-8b35-8f64c0071f73-config-volume\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.338338 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.338337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:18.439463 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.439410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b292e05-658a-4efc-8b35-8f64c0071f73-config-volume\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.439463 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.439460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:18.439887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.439506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmx2r\" (UniqueName: \"kubernetes.io/projected/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-kube-api-access-gmx2r\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:18.439887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.439575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b292e05-658a-4efc-8b35-8f64c0071f73-tmp-dir\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.439887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.439636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.439887 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.439647 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:18.439887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.439660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrzx6\" (UniqueName: \"kubernetes.io/projected/5b292e05-658a-4efc-8b35-8f64c0071f73-kube-api-access-vrzx6\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.439887 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.439732 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert podName:e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:18.939702825 +0000 UTC m=+34.630116082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert") pod "ingress-canary-hsc5z" (UID: "e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425") : secret "canary-serving-cert" not found
Apr 20 19:26:18.440158 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.439891 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:18.440158 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.439954 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls podName:5b292e05-658a-4efc-8b35-8f64c0071f73 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:18.939936693 +0000 UTC m=+34.630349961 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls") pod "dns-default-d6zt6" (UID: "5b292e05-658a-4efc-8b35-8f64c0071f73") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:18.440158 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.439963 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5b292e05-658a-4efc-8b35-8f64c0071f73-tmp-dir\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.440290 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.440178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5b292e05-658a-4efc-8b35-8f64c0071f73-config-volume\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.465189 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.465149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmx2r\" (UniqueName: \"kubernetes.io/projected/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-kube-api-access-gmx2r\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:18.465641 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.465606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrzx6\" (UniqueName: \"kubernetes.io/projected/5b292e05-658a-4efc-8b35-8f64c0071f73-kube-api-access-vrzx6\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.641928 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.641872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:18.642218 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.641961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:26:18.642218 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.642051 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:26:18.642218 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.642075 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:26:18.642218 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.642085 2575 projected.go:194] Error preparing data for projected volume kube-api-access-j5ml6 for pod openshift-network-diagnostics/network-check-target-64gnz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:26:18.642218 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.642090 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:26:18.642218 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.642149 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs podName:234402da-caaa-48f3-8a69-400f12f55eb6 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:50.642129149 +0000 UTC m=+66.332542404 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs") pod "network-metrics-daemon-j9xp6" (UID: "234402da-caaa-48f3-8a69-400f12f55eb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:26:18.642218 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.642172 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6 podName:77302d9f-23b6-4aa5-9de7-368ff66ca70e nodeName:}" failed. No retries permitted until 2026-04-20 19:26:50.642160379 +0000 UTC m=+66.332573631 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-j5ml6" (UniqueName: "kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6") pod "network-check-target-64gnz" (UID: "77302d9f-23b6-4aa5-9de7-368ff66ca70e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:26:18.944604 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.944563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:18.945019 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:18.944642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:18.945019 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.944720 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:18.945019 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.944710 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:18.945019 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.944773 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls podName:5b292e05-658a-4efc-8b35-8f64c0071f73 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:19.944760478 +0000 UTC m=+35.635173730 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls") pod "dns-default-d6zt6" (UID: "5b292e05-658a-4efc-8b35-8f64c0071f73") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:18.945019 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:18.944786 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert podName:e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:19.944780295 +0000 UTC m=+35.635193548 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert") pod "ingress-canary-hsc5z" (UID: "e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425") : secret "canary-serving-cert" not found
Apr 20 19:26:19.903534 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:19.903499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:19.903745 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:19.903499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:26:19.910067 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:19.910041 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 19:26:19.911230 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:19.911205 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-89kqx\""
Apr 20 19:26:19.911405 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:19.911249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 19:26:19.911405 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:19.911250 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8fpn\""
Apr 20 19:26:19.911405 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:19.911205 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 19:26:19.952608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:19.952567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:19.953054 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:19.952677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:19.953054 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:19.952718 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:19.953054 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:19.952796 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls podName:5b292e05-658a-4efc-8b35-8f64c0071f73 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:21.952779205 +0000 UTC m=+37.643192458 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls") pod "dns-default-d6zt6" (UID: "5b292e05-658a-4efc-8b35-8f64c0071f73") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:19.953054 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:19.952810 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:19.953054 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:19.952855 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert podName:e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:21.952842707 +0000 UTC m=+37.643255958 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert") pod "ingress-canary-hsc5z" (UID: "e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425") : secret "canary-serving-cert" not found
Apr 20 19:26:20.068045 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:20.068002 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e3c3be7-736a-4d5e-ae42-c0f7e318af44" containerID="aace4ca42a47e0a4444d943d2d8913a2a5b92525982cf531198e29fd938ff7bb" exitCode=0
Apr 20 19:26:20.068201 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:20.068076 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" event={"ID":"3e3c3be7-736a-4d5e-ae42-c0f7e318af44","Type":"ContainerDied","Data":"aace4ca42a47e0a4444d943d2d8913a2a5b92525982cf531198e29fd938ff7bb"}
Apr 20 19:26:21.073288 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:21.073252 2575 generic.go:358] "Generic (PLEG): container finished" podID="3e3c3be7-736a-4d5e-ae42-c0f7e318af44" containerID="67857b1d4452c1fdd8331f7cd7097d0fb717d34a6d3c7b995e73b2d27a750760" exitCode=0
Apr 20 19:26:21.073687 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:21.073321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" event={"ID":"3e3c3be7-736a-4d5e-ae42-c0f7e318af44","Type":"ContainerDied","Data":"67857b1d4452c1fdd8331f7cd7097d0fb717d34a6d3c7b995e73b2d27a750760"}
Apr 20 19:26:21.967777 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:21.967534 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:21.967939 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:21.967827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:21.967939 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:21.967717 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:21.967939 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:21.967908 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls podName:5b292e05-658a-4efc-8b35-8f64c0071f73 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:25.96788461 +0000 UTC m=+41.658297862 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls") pod "dns-default-d6zt6" (UID: "5b292e05-658a-4efc-8b35-8f64c0071f73") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:21.967939 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:21.967912 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:21.968078 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:21.967950 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert podName:e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:25.967938387 +0000 UTC m=+41.658351639 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert") pod "ingress-canary-hsc5z" (UID: "e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425") : secret "canary-serving-cert" not found
Apr 20 19:26:22.078518 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:22.078481 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" event={"ID":"3e3c3be7-736a-4d5e-ae42-c0f7e318af44","Type":"ContainerStarted","Data":"80fdb2bc525e52c6f460b97e7397b491de78d8278318d7495be1fdb5fbaeed38"}
Apr 20 19:26:22.113163 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:22.113115 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wbqgg" podStartSLOduration=5.646023229 podStartE2EDuration="37.113101058s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:25:47.592901807 +0000 UTC m=+3.283315059" lastFinishedPulling="2026-04-20 19:26:19.059979632 +0000 UTC m=+34.750392888" observedRunningTime="2026-04-20 19:26:22.112786737 +0000 UTC m=+37.803200012" watchObservedRunningTime="2026-04-20 19:26:22.113101058 +0000 UTC m=+37.803514311"
Apr 20 19:26:25.997073 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:25.997019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:25.997073 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:25.997093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:25.998012 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:25.997175 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:25.998012 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:25.997184 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:25.998012 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:25.997239 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert podName:e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:33.997223737 +0000 UTC m=+49.687636988 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert") pod "ingress-canary-hsc5z" (UID: "e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425") : secret "canary-serving-cert" not found
Apr 20 19:26:25.998012 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:25.997256 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls podName:5b292e05-658a-4efc-8b35-8f64c0071f73 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:33.997247767 +0000 UTC m=+49.687661018 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls") pod "dns-default-d6zt6" (UID: "5b292e05-658a-4efc-8b35-8f64c0071f73") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:34.058864 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:34.058819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:34.059328 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:34.058883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:34.059328 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:34.058976 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:34.059328 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:34.058993 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:34.059328 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:34.059042 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert podName:e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:50.059027004 +0000 UTC m=+65.749440256 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert") pod "ingress-canary-hsc5z" (UID: "e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425") : secret "canary-serving-cert" not found
Apr 20 19:26:34.059328 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:34.059117 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls podName:5b292e05-658a-4efc-8b35-8f64c0071f73 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:50.059097419 +0000 UTC m=+65.749510675 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls") pod "dns-default-d6zt6" (UID: "5b292e05-658a-4efc-8b35-8f64c0071f73") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:44.102292 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:44.102259 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w558t"
Apr 20 19:26:50.070860 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.070805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:26:50.070860 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.070880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:26:50.071357 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:50.070979 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:26:50.071357 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:50.070981 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:26:50.071357 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:50.071036 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls podName:5b292e05-658a-4efc-8b35-8f64c0071f73 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:22.071020648 +0000 UTC m=+97.761433900 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls") pod "dns-default-d6zt6" (UID: "5b292e05-658a-4efc-8b35-8f64c0071f73") : secret "dns-default-metrics-tls" not found
Apr 20 19:26:50.071357 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:50.071059 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert podName:e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:22.071042035 +0000 UTC m=+97.761455298 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert") pod "ingress-canary-hsc5z" (UID: "e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425") : secret "canary-serving-cert" not found
Apr 20 19:26:50.674193 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.674145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:50.674397 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.674227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:26:50.677700 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.677673 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 19:26:50.677818 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.677794 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 19:26:50.685117 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:50.685086 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 19:26:50.685260 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:26:50.685182 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs podName:234402da-caaa-48f3-8a69-400f12f55eb6 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:54.685163187 +0000 UTC m=+130.375576439 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs") pod "network-metrics-daemon-j9xp6" (UID: "234402da-caaa-48f3-8a69-400f12f55eb6") : secret "metrics-daemon-secret" not found
Apr 20 19:26:50.687840 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.687818 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 19:26:50.698908 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.698873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5ml6\" (UniqueName: \"kubernetes.io/projected/77302d9f-23b6-4aa5-9de7-368ff66ca70e-kube-api-access-j5ml6\") pod \"network-check-target-64gnz\" (UID: \"77302d9f-23b6-4aa5-9de7-368ff66ca70e\") " pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:50.816505 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.816477 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-89kqx\""
Apr 20 19:26:50.824812 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.824785 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:50.965085 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:50.965000 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-64gnz"]
Apr 20 19:26:50.969677 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:26:50.969645 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77302d9f_23b6_4aa5_9de7_368ff66ca70e.slice/crio-4306a18e7cd2f7d1746bdca79b0b54533812bb9bb7cb6b7b60a085c4d4fd98f6 WatchSource:0}: Error finding container 4306a18e7cd2f7d1746bdca79b0b54533812bb9bb7cb6b7b60a085c4d4fd98f6: Status 404 returned error can't find the container with id 4306a18e7cd2f7d1746bdca79b0b54533812bb9bb7cb6b7b60a085c4d4fd98f6
Apr 20 19:26:51.138254 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:51.138215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-64gnz" event={"ID":"77302d9f-23b6-4aa5-9de7-368ff66ca70e","Type":"ContainerStarted","Data":"4306a18e7cd2f7d1746bdca79b0b54533812bb9bb7cb6b7b60a085c4d4fd98f6"}
Apr 20 19:26:54.145776 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:54.145742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-64gnz" event={"ID":"77302d9f-23b6-4aa5-9de7-368ff66ca70e","Type":"ContainerStarted","Data":"5cab541c4ed5d634a68bf2e59aef21979b5d9ec2d8183bf27ee36a433eade0ab"}
Apr 20 19:26:54.146144 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:54.145864 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:26:54.165077 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:26:54.165020 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-64gnz" podStartSLOduration=66.320019863 podStartE2EDuration="1m9.165003787s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:26:50.973458103 +0000 UTC m=+66.663871356" lastFinishedPulling="2026-04-20 19:26:53.818442025 +0000 UTC m=+69.508855280" observedRunningTime="2026-04-20 19:26:54.164577756 +0000 UTC m=+69.854991030" watchObservedRunningTime="2026-04-20 19:26:54.165003787 +0000 UTC m=+69.855417060"
Apr 20 19:27:22.096873 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:22.096831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z"
Apr 20 19:27:22.096873 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:22.096887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:27:22.097365 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:22.096977 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:27:22.097365 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:22.096992 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:27:22.097365 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:22.097031 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls podName:5b292e05-658a-4efc-8b35-8f64c0071f73 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:26.097016986 +0000 UTC m=+161.787430238 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls") pod "dns-default-d6zt6" (UID: "5b292e05-658a-4efc-8b35-8f64c0071f73") : secret "dns-default-metrics-tls" not found
Apr 20 19:27:22.097365 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:22.097050 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert podName:e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:26.097037029 +0000 UTC m=+161.787450281 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert") pod "ingress-canary-hsc5z" (UID: "e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425") : secret "canary-serving-cert" not found
Apr 20 19:27:25.150141 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:25.150111 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-64gnz"
Apr 20 19:27:34.178790 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.178754 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-j9gnb"]
Apr 20 19:27:34.183878 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.183848 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-65d6b6f7bd-fw8wn"]
Apr 20 19:27:34.184040 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.184003 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.186310 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.186288 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 19:27:34.186450 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.186288 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 19:27:34.186450 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.186291 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 19:27:34.186692 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.186676 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-lfhpz\"" Apr 20 19:27:34.186781 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.186717 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 19:27:34.187249 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.187233 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.190285 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.190258 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 19:27:34.190558 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.190453 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-j9gnb"] Apr 20 19:27:34.190911 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.190258 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 19:27:34.191071 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.190955 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rvjwh\"" Apr 20 19:27:34.191207 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.191191 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 19:27:34.191322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.191305 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 19:27:34.191380 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.191358 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 19:27:34.191429 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.191363 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 19:27:34.198045 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.198012 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 19:27:34.198216 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:27:34.198104 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65d6b6f7bd-fw8wn"] Apr 20 19:27:34.284161 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284121 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.284161 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4b3625-678c-45ba-87f5-25257a97723a-serving-cert\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.284385 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb4b3625-678c-45ba-87f5-25257a97723a-tmp\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.284385 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh6qj\" (UniqueName: \"kubernetes.io/projected/28176d29-9406-4440-8156-fe54a5e5596e-kube-api-access-wh6qj\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.284385 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:27:34.284283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4b3625-678c-45ba-87f5-25257a97723a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.284385 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4b3625-678c-45ba-87f5-25257a97723a-service-ca-bundle\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.284524 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-default-certificate\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.284524 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxdw2\" (UniqueName: \"kubernetes.io/projected/eb4b3625-678c-45ba-87f5-25257a97723a-kube-api-access-gxdw2\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.284524 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-stats-auth\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.284524 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.284524 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.284502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/eb4b3625-678c-45ba-87f5-25257a97723a-snapshots\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.385719 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.385665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4b3625-678c-45ba-87f5-25257a97723a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.385926 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.385742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4b3625-678c-45ba-87f5-25257a97723a-service-ca-bundle\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.385926 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:27:34.385781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-default-certificate\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.385926 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.385803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxdw2\" (UniqueName: \"kubernetes.io/projected/eb4b3625-678c-45ba-87f5-25257a97723a-kube-api-access-gxdw2\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.385926 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.385826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-stats-auth\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.385926 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.385849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.386185 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:34.385940 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:27:34.386185 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:34.386013 2575 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:34.885992608 +0000 UTC m=+110.576405865 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : secret "router-metrics-certs-default" not found Apr 20 19:27:34.386185 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.386033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/eb4b3625-678c-45ba-87f5-25257a97723a-snapshots\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.386185 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.386063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.386185 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.386097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4b3625-678c-45ba-87f5-25257a97723a-serving-cert\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.386185 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.386133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/eb4b3625-678c-45ba-87f5-25257a97723a-tmp\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.386185 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.386160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh6qj\" (UniqueName: \"kubernetes.io/projected/28176d29-9406-4440-8156-fe54a5e5596e-kube-api-access-wh6qj\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.386501 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:34.386286 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:34.886258202 +0000 UTC m=+110.576671456 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : configmap references non-existent config key: service-ca.crt Apr 20 19:27:34.386501 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.386429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4b3625-678c-45ba-87f5-25257a97723a-service-ca-bundle\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.386818 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.386797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb4b3625-678c-45ba-87f5-25257a97723a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.387101 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.387048 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb4b3625-678c-45ba-87f5-25257a97723a-tmp\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.387217 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.387149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/eb4b3625-678c-45ba-87f5-25257a97723a-snapshots\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.388591 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.388568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-default-certificate\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.388728 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.388707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-stats-auth\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.388776 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.388707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb4b3625-678c-45ba-87f5-25257a97723a-serving-cert\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: \"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.396408 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.396376 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh6qj\" (UniqueName: \"kubernetes.io/projected/28176d29-9406-4440-8156-fe54a5e5596e-kube-api-access-wh6qj\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.398104 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.398075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxdw2\" (UniqueName: \"kubernetes.io/projected/eb4b3625-678c-45ba-87f5-25257a97723a-kube-api-access-gxdw2\") pod \"insights-operator-585dfdc468-j9gnb\" (UID: 
\"eb4b3625-678c-45ba-87f5-25257a97723a\") " pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.500637 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.500505 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-j9gnb" Apr 20 19:27:34.649502 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.649468 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-j9gnb"] Apr 20 19:27:34.653280 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:27:34.653247 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4b3625_678c_45ba_87f5_25257a97723a.slice/crio-be8994e84e9b1a0c8bf0d28466623752c491480df136d7b685844f24d483efa3 WatchSource:0}: Error finding container be8994e84e9b1a0c8bf0d28466623752c491480df136d7b685844f24d483efa3: Status 404 returned error can't find the container with id be8994e84e9b1a0c8bf0d28466623752c491480df136d7b685844f24d483efa3 Apr 20 19:27:34.889713 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.889677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.889869 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:34.889795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:34.889909 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:34.889871 
2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:35.889849987 +0000 UTC m=+111.580263259 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : configmap references non-existent config key: service-ca.crt Apr 20 19:27:34.889909 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:34.889896 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:27:34.889983 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:34.889932 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:35.889921152 +0000 UTC m=+111.580334405 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : secret "router-metrics-certs-default" not found Apr 20 19:27:35.226960 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:35.226864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-j9gnb" event={"ID":"eb4b3625-678c-45ba-87f5-25257a97723a","Type":"ContainerStarted","Data":"be8994e84e9b1a0c8bf0d28466623752c491480df136d7b685844f24d483efa3"} Apr 20 19:27:35.896936 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:35.896889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:35.897134 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:35.896973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:27:35.897134 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:35.897051 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 19:27:35.897251 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:35.897141 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. 
No retries permitted until 2026-04-20 19:27:37.897117325 +0000 UTC m=+113.587530593 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : secret "router-metrics-certs-default" not found Apr 20 19:27:35.897251 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:35.897164 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:37.897154382 +0000 UTC m=+113.587567643 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : configmap references non-existent config key: service-ca.crt Apr 20 19:27:37.231182 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:37.231147 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-j9gnb" event={"ID":"eb4b3625-678c-45ba-87f5-25257a97723a","Type":"ContainerStarted","Data":"e3c778fe95538756878565b8b32ba7f602b06e81c11bbd8d4196b8aaf42f2753"} Apr 20 19:27:37.253519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:37.253451 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-j9gnb" podStartSLOduration=1.6793532469999999 podStartE2EDuration="3.25343647s" podCreationTimestamp="2026-04-20 19:27:34 +0000 UTC" firstStartedPulling="2026-04-20 19:27:34.655038066 +0000 UTC m=+110.345451318" lastFinishedPulling="2026-04-20 19:27:36.229121288 +0000 UTC m=+111.919534541" observedRunningTime="2026-04-20 19:27:37.252223789 +0000 
UTC m=+112.942637053" watchObservedRunningTime="2026-04-20 19:27:37.25343647 +0000 UTC m=+112.943849743"
Apr 20 19:27:37.911088 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:37.911046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn"
Apr 20 19:27:37.911088 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:37.911097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn"
Apr 20 19:27:37.911303 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:37.911206 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 19:27:37.911303 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:37.911221 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:41.911208218 +0000 UTC m=+117.601621470 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : configmap references non-existent config key: service-ca.crt
Apr 20 19:27:37.911303 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:37.911279 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:41.911264439 +0000 UTC m=+117.601677692 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : secret "router-metrics-certs-default" not found
Apr 20 19:27:39.659244 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:39.659215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bttgn_9ce44109-8d3b-4499-b0fb-fa475f90e132/dns-node-resolver/0.log"
Apr 20 19:27:40.059169 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:40.059094 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b6xwm_148d86dd-6cae-42a6-9e0e-44b0a13baa33/node-ca/0.log"
Apr 20 19:27:41.937969 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:41.937927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn"
Apr 20 19:27:41.938434 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:41.937986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn"
Apr 20 19:27:41.938434 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:41.938088 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 19:27:41.938434 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:41.938149 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:49.938133616 +0000 UTC m=+125.628546894 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : configmap references non-existent config key: service-ca.crt
Apr 20 19:27:41.938434 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:41.938165 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:49.938159223 +0000 UTC m=+125.628572476 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : secret "router-metrics-certs-default" not found
Apr 20 19:27:44.270188 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.270153 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"]
Apr 20 19:27:44.273418 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.273399 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"
Apr 20 19:27:44.275912 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.275879 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-gw2l8\""
Apr 20 19:27:44.276090 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.275898 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 20 19:27:44.278169 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.278152 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:27:44.278255 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.278237 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 20 19:27:44.286908 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.286874 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"]
Apr 20 19:27:44.358259 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.358218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"
Apr 20 19:27:44.358259 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.358269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgw5v\" (UniqueName: \"kubernetes.io/projected/26d26f03-9d92-44d3-81f0-f95956a7a385-kube-api-access-vgw5v\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"
Apr 20 19:27:44.459430 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.459388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"
Apr 20 19:27:44.459603 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.459435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgw5v\" (UniqueName: \"kubernetes.io/projected/26d26f03-9d92-44d3-81f0-f95956a7a385-kube-api-access-vgw5v\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"
Apr 20 19:27:44.459603 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:44.459550 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 19:27:44.459716 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:44.459649 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls podName:26d26f03-9d92-44d3-81f0-f95956a7a385 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:44.95960203 +0000 UTC m=+120.650015283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-777vt" (UID: "26d26f03-9d92-44d3-81f0-f95956a7a385") : secret "samples-operator-tls" not found
Apr 20 19:27:44.468279 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.468240 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgw5v\" (UniqueName: \"kubernetes.io/projected/26d26f03-9d92-44d3-81f0-f95956a7a385-kube-api-access-vgw5v\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"
Apr 20 19:27:44.963930 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:44.963886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"
Apr 20 19:27:44.964127 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:44.964040 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 19:27:44.964127 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:44.964108 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls podName:26d26f03-9d92-44d3-81f0-f95956a7a385 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:45.964090454 +0000 UTC m=+121.654503707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-777vt" (UID: "26d26f03-9d92-44d3-81f0-f95956a7a385") : secret "samples-operator-tls" not found
Apr 20 19:27:45.971897 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:45.971855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"
Apr 20 19:27:45.972412 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:45.972047 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 19:27:45.972412 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:45.972136 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls podName:26d26f03-9d92-44d3-81f0-f95956a7a385 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:47.972114445 +0000 UTC m=+123.662527697 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-777vt" (UID: "26d26f03-9d92-44d3-81f0-f95956a7a385") : secret "samples-operator-tls" not found
Apr 20 19:27:46.414912 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.414876 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-96bb49785-fxs9k"]
Apr 20 19:27:46.417887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.417861 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.421230 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.421193 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 19:27:46.421394 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.421202 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 19:27:46.421394 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.421205 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 19:27:46.422125 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.422100 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8vchn\""
Apr 20 19:27:46.430452 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.430423 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 19:27:46.431350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.431323 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-96bb49785-fxs9k"]
Apr 20 19:27:46.575180 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.575138 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-image-registry-private-configuration\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.575180 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.575178 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-ca-trust-extracted\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.575452 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.575197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-bound-sa-token\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.575452 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.575300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-certificates\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.575452 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.575359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-trusted-ca\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.575452 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.575384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-installation-pull-secrets\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.575643 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.575457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.575643 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.575547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx94k\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-kube-api-access-nx94k\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.676802 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.676692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.676802 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.676774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nx94k\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-kube-api-access-nx94k\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.676802 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.676801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-image-registry-private-configuration\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.677091 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.676824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-ca-trust-extracted\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.677091 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.676841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-bound-sa-token\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.677091 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:46.676876 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:27:46.677091 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:46.676899 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96bb49785-fxs9k: secret "image-registry-tls" not found
Apr 20 19:27:46.677091 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:46.676962 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls podName:aa5f1199-c7b4-4aef-8ca3-fefa785ef54a nodeName:}" failed. No retries permitted until 2026-04-20 19:27:47.176942387 +0000 UTC m=+122.867355639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls") pod "image-registry-96bb49785-fxs9k" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a") : secret "image-registry-tls" not found
Apr 20 19:27:46.677091 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.676987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-certificates\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.677091 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.677037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-trusted-ca\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.677091 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.677063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-installation-pull-secrets\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.677421 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.677224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-ca-trust-extracted\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.677652 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.677603 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-certificates\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.679048 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.679018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-trusted-ca\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.679484 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.679465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-image-registry-private-configuration\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.679583 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.679565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-installation-pull-secrets\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.687251 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.687215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-bound-sa-token\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:46.687841 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:46.687809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx94k\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-kube-api-access-nx94k\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:47.181877 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:47.181843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:47.182289 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:47.181970 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:27:47.182289 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:47.181984 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96bb49785-fxs9k: secret "image-registry-tls" not found
Apr 20 19:27:47.182289 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:47.182042 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls podName:aa5f1199-c7b4-4aef-8ca3-fefa785ef54a nodeName:}" failed. No retries permitted until 2026-04-20 19:27:48.182028828 +0000 UTC m=+123.872442080 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls") pod "image-registry-96bb49785-fxs9k" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a") : secret "image-registry-tls" not found
Apr 20 19:27:47.989569 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:47.989525 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"
Apr 20 19:27:47.989770 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:47.989717 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 19:27:47.989816 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:47.989791 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls podName:26d26f03-9d92-44d3-81f0-f95956a7a385 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:51.98977424 +0000 UTC m=+127.680187491 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-777vt" (UID: "26d26f03-9d92-44d3-81f0-f95956a7a385") : secret "samples-operator-tls" not found
Apr 20 19:27:48.191443 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:48.191402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:48.191972 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:48.191573 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:27:48.191972 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:48.191601 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96bb49785-fxs9k: secret "image-registry-tls" not found
Apr 20 19:27:48.191972 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:48.191703 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls podName:aa5f1199-c7b4-4aef-8ca3-fefa785ef54a nodeName:}" failed. No retries permitted until 2026-04-20 19:27:50.191682876 +0000 UTC m=+125.882096145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls") pod "image-registry-96bb49785-fxs9k" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a") : secret "image-registry-tls" not found
Apr 20 19:27:50.007980 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.007927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn"
Apr 20 19:27:50.007980 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.007988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn"
Apr 20 19:27:50.008465 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:50.008080 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 19:27:50.008465 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:50.008129 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:28:06.008113802 +0000 UTC m=+141.698527054 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : configmap references non-existent config key: service-ca.crt
Apr 20 19:27:50.008465 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:50.008153 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs podName:28176d29-9406-4440-8156-fe54a5e5596e nodeName:}" failed. No retries permitted until 2026-04-20 19:28:06.00813798 +0000 UTC m=+141.698551232 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs") pod "router-default-65d6b6f7bd-fw8wn" (UID: "28176d29-9406-4440-8156-fe54a5e5596e") : secret "router-metrics-certs-default" not found
Apr 20 19:27:50.209595 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.209558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:27:50.209762 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:50.209724 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:27:50.209762 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:50.209746 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96bb49785-fxs9k: secret "image-registry-tls" not found
Apr 20 19:27:50.209848 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:50.209799 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls podName:aa5f1199-c7b4-4aef-8ca3-fefa785ef54a nodeName:}" failed. No retries permitted until 2026-04-20 19:27:54.209784137 +0000 UTC m=+129.900197390 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls") pod "image-registry-96bb49785-fxs9k" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a") : secret "image-registry-tls" not found
Apr 20 19:27:50.447309 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.447277 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr"]
Apr 20 19:27:50.451484 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.451457 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr"
Apr 20 19:27:50.454066 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.454043 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 20 19:27:50.454168 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.454107 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 20 19:27:50.455604 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.455571 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-h695z\""
Apr 20 19:27:50.458454 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.458418 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr"]
Apr 20 19:27:50.613260 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.613209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdjv5\" (UniqueName: \"kubernetes.io/projected/7ccf1368-39c1-4d86-9a71-be00cf3fa5f8-kube-api-access-cdjv5\") pod \"migrator-74bb7799d9-xwrgr\" (UID: \"7ccf1368-39c1-4d86-9a71-be00cf3fa5f8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr"
Apr 20 19:27:50.714443 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.714348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdjv5\" (UniqueName: \"kubernetes.io/projected/7ccf1368-39c1-4d86-9a71-be00cf3fa5f8-kube-api-access-cdjv5\") pod \"migrator-74bb7799d9-xwrgr\" (UID: \"7ccf1368-39c1-4d86-9a71-be00cf3fa5f8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr"
Apr 20 19:27:50.723406 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.723373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdjv5\" (UniqueName: \"kubernetes.io/projected/7ccf1368-39c1-4d86-9a71-be00cf3fa5f8-kube-api-access-cdjv5\") pod \"migrator-74bb7799d9-xwrgr\" (UID: \"7ccf1368-39c1-4d86-9a71-be00cf3fa5f8\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr"
Apr 20 19:27:50.763548 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.763514 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr"
Apr 20 19:27:50.886198 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.886156 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr"]
Apr 20 19:27:50.890914 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:27:50.890878 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ccf1368_39c1_4d86_9a71_be00cf3fa5f8.slice/crio-db97da775d23df6a7aabb14ec0dc549fbcb4bba5082c25e56791360a899e1618 WatchSource:0}: Error finding container db97da775d23df6a7aabb14ec0dc549fbcb4bba5082c25e56791360a899e1618: Status 404 returned error can't find the container with id db97da775d23df6a7aabb14ec0dc549fbcb4bba5082c25e56791360a899e1618
Apr 20 19:27:50.902909 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.902875 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-btnm6"]
Apr 20 19:27:50.907722 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.907689 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:50.915710 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.915687 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 19:27:50.916771 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.916751 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 19:27:50.918006 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.917988 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4kb52\"" Apr 20 19:27:50.938394 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:50.938360 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-btnm6"] Apr 20 19:27:51.016931 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.016831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4b58e213-637f-47ba-8916-6ce23705dc6f-crio-socket\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.016931 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.016869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4b58e213-637f-47ba-8916-6ce23705dc6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.016931 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.016899 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4b58e213-637f-47ba-8916-6ce23705dc6f-data-volume\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.017346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.016969 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtclf\" (UniqueName: \"kubernetes.io/projected/4b58e213-637f-47ba-8916-6ce23705dc6f-kube-api-access-rtclf\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.017346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.017004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.117838 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.117799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4b58e213-637f-47ba-8916-6ce23705dc6f-crio-socket\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.117838 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.117838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4b58e213-637f-47ba-8916-6ce23705dc6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " 
pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.118089 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.117868 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4b58e213-637f-47ba-8916-6ce23705dc6f-data-volume\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.118089 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.117894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtclf\" (UniqueName: \"kubernetes.io/projected/4b58e213-637f-47ba-8916-6ce23705dc6f-kube-api-access-rtclf\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.118089 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.117920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.118089 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.117938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4b58e213-637f-47ba-8916-6ce23705dc6f-crio-socket\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.118089 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:51.118032 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found 
Apr 20 19:27:51.118326 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:51.118103 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls podName:4b58e213-637f-47ba-8916-6ce23705dc6f nodeName:}" failed. No retries permitted until 2026-04-20 19:27:51.618086466 +0000 UTC m=+127.308499734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-btnm6" (UID: "4b58e213-637f-47ba-8916-6ce23705dc6f") : secret "insights-runtime-extractor-tls" not found Apr 20 19:27:51.118326 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.118256 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4b58e213-637f-47ba-8916-6ce23705dc6f-data-volume\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.118413 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.118393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4b58e213-637f-47ba-8916-6ce23705dc6f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.126368 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.126329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtclf\" (UniqueName: \"kubernetes.io/projected/4b58e213-637f-47ba-8916-6ce23705dc6f-kube-api-access-rtclf\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 
20 19:27:51.257094 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.257049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr" event={"ID":"7ccf1368-39c1-4d86-9a71-be00cf3fa5f8","Type":"ContainerStarted","Data":"db97da775d23df6a7aabb14ec0dc549fbcb4bba5082c25e56791360a899e1618"} Apr 20 19:27:51.622323 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:51.622283 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:51.622522 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:51.622459 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:27:51.622583 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:51.622557 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls podName:4b58e213-637f-47ba-8916-6ce23705dc6f nodeName:}" failed. No retries permitted until 2026-04-20 19:27:52.622535114 +0000 UTC m=+128.312948377 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-btnm6" (UID: "4b58e213-637f-47ba-8916-6ce23705dc6f") : secret "insights-runtime-extractor-tls" not found Apr 20 19:27:52.025892 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:52.025846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt" Apr 20 19:27:52.026311 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:52.025977 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 19:27:52.026311 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:52.026042 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls podName:26d26f03-9d92-44d3-81f0-f95956a7a385 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:00.026027657 +0000 UTC m=+135.716440910 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-777vt" (UID: "26d26f03-9d92-44d3-81f0-f95956a7a385") : secret "samples-operator-tls" not found Apr 20 19:27:52.260428 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:52.260395 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr" event={"ID":"7ccf1368-39c1-4d86-9a71-be00cf3fa5f8","Type":"ContainerStarted","Data":"e8d58c86c85c3ac283e29628b1ab4950e82080311e9e62d35b0fe97169f3869f"} Apr 20 19:27:52.260428 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:52.260434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr" event={"ID":"7ccf1368-39c1-4d86-9a71-be00cf3fa5f8","Type":"ContainerStarted","Data":"1f3480977b222a157b7f434bcb296a4e4420b7f2e0db41fc552c1079c0ebb10a"} Apr 20 19:27:52.275362 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:52.275303 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xwrgr" podStartSLOduration=1.131353222 podStartE2EDuration="2.275286692s" podCreationTimestamp="2026-04-20 19:27:50 +0000 UTC" firstStartedPulling="2026-04-20 19:27:50.893316121 +0000 UTC m=+126.583729373" lastFinishedPulling="2026-04-20 19:27:52.037249588 +0000 UTC m=+127.727662843" observedRunningTime="2026-04-20 19:27:52.274825013 +0000 UTC m=+127.965238289" watchObservedRunningTime="2026-04-20 19:27:52.275286692 +0000 UTC m=+127.965699966" Apr 20 19:27:52.631284 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:52.631242 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:52.631529 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:52.631365 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:27:52.631529 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:52.631418 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls podName:4b58e213-637f-47ba-8916-6ce23705dc6f nodeName:}" failed. No retries permitted until 2026-04-20 19:27:54.631404923 +0000 UTC m=+130.321818175 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-btnm6" (UID: "4b58e213-637f-47ba-8916-6ce23705dc6f") : secret "insights-runtime-extractor-tls" not found Apr 20 19:27:54.244568 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:54.244522 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k" Apr 20 19:27:54.245037 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:54.244715 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:27:54.245037 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:54.244741 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-96bb49785-fxs9k: secret "image-registry-tls" not found Apr 20 
19:27:54.245037 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:54.244799 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls podName:aa5f1199-c7b4-4aef-8ca3-fefa785ef54a nodeName:}" failed. No retries permitted until 2026-04-20 19:28:02.244782384 +0000 UTC m=+137.935195636 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls") pod "image-registry-96bb49785-fxs9k" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a") : secret "image-registry-tls" not found Apr 20 19:27:54.648197 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:54.648161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:54.648381 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:54.648336 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:27:54.648429 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:54.648410 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls podName:4b58e213-637f-47ba-8916-6ce23705dc6f nodeName:}" failed. No retries permitted until 2026-04-20 19:27:58.648392307 +0000 UTC m=+134.338805559 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-btnm6" (UID: "4b58e213-637f-47ba-8916-6ce23705dc6f") : secret "insights-runtime-extractor-tls" not found Apr 20 19:27:54.749080 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:54.749031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:27:54.749267 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:54.749178 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:27:54.749267 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:54.749242 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs podName:234402da-caaa-48f3-8a69-400f12f55eb6 nodeName:}" failed. No retries permitted until 2026-04-20 19:29:56.749226028 +0000 UTC m=+252.439639284 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs") pod "network-metrics-daemon-j9xp6" (UID: "234402da-caaa-48f3-8a69-400f12f55eb6") : secret "metrics-daemon-secret" not found Apr 20 19:27:58.680950 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:27:58.680909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:27:58.681358 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:58.681036 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:27:58.681358 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:27:58.681092 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls podName:4b58e213-637f-47ba-8916-6ce23705dc6f nodeName:}" failed. No retries permitted until 2026-04-20 19:28:06.681077658 +0000 UTC m=+142.371490914 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-btnm6" (UID: "4b58e213-637f-47ba-8916-6ce23705dc6f") : secret "insights-runtime-extractor-tls" not found Apr 20 19:28:00.091740 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:00.091684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt" Apr 20 19:28:00.094275 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:00.094246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/26d26f03-9d92-44d3-81f0-f95956a7a385-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-777vt\" (UID: \"26d26f03-9d92-44d3-81f0-f95956a7a385\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt" Apr 20 19:28:00.185255 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:00.185220 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-gw2l8\"" Apr 20 19:28:00.193529 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:00.193497 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt" Apr 20 19:28:00.317342 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:00.317309 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt"] Apr 20 19:28:01.281059 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:01.281014 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt" event={"ID":"26d26f03-9d92-44d3-81f0-f95956a7a385","Type":"ContainerStarted","Data":"2d222cdc889d7c3d1274a81bc62eab482fb7b6d4f99da4ec508b267f1a4a693c"} Apr 20 19:28:02.284820 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:02.284781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt" event={"ID":"26d26f03-9d92-44d3-81f0-f95956a7a385","Type":"ContainerStarted","Data":"173ee3ea9a374fc02f5855b6fd92bbfeabc4138053d482fe956740cac26dfd7b"} Apr 20 19:28:02.284820 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:02.284818 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt" event={"ID":"26d26f03-9d92-44d3-81f0-f95956a7a385","Type":"ContainerStarted","Data":"fb7b9331f202a07662b2b5f1066aed9132479c23c51635a3357717cfff1283ca"} Apr 20 19:28:02.300422 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:02.300362 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-777vt" podStartSLOduration=16.838370913 podStartE2EDuration="18.300346918s" podCreationTimestamp="2026-04-20 19:27:44 +0000 UTC" firstStartedPulling="2026-04-20 19:28:00.363598517 +0000 UTC m=+136.054011770" lastFinishedPulling="2026-04-20 19:28:01.825574512 +0000 UTC m=+137.515987775" observedRunningTime="2026-04-20 
19:28:02.299702765 +0000 UTC m=+137.990116053" watchObservedRunningTime="2026-04-20 19:28:02.300346918 +0000 UTC m=+137.990760209" Apr 20 19:28:02.312082 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:02.312047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k" Apr 20 19:28:02.314627 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:02.314588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") pod \"image-registry-96bb49785-fxs9k\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " pod="openshift-image-registry/image-registry-96bb49785-fxs9k" Apr 20 19:28:02.329588 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:02.329549 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-96bb49785-fxs9k" Apr 20 19:28:02.458677 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:02.458645 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-96bb49785-fxs9k"] Apr 20 19:28:02.461963 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:02.461931 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa5f1199_c7b4_4aef_8ca3_fefa785ef54a.slice/crio-7612209c306c28f3b070069900f5a47ac506dc8770218a3ea9a3dd8359336806 WatchSource:0}: Error finding container 7612209c306c28f3b070069900f5a47ac506dc8770218a3ea9a3dd8359336806: Status 404 returned error can't find the container with id 7612209c306c28f3b070069900f5a47ac506dc8770218a3ea9a3dd8359336806 Apr 20 19:28:03.289920 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:03.289878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96bb49785-fxs9k" event={"ID":"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a","Type":"ContainerStarted","Data":"f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83"} Apr 20 19:28:03.289920 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:03.289924 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96bb49785-fxs9k" event={"ID":"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a","Type":"ContainerStarted","Data":"7612209c306c28f3b070069900f5a47ac506dc8770218a3ea9a3dd8359336806"} Apr 20 19:28:03.309308 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:03.309257 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-96bb49785-fxs9k" podStartSLOduration=17.309242316 podStartE2EDuration="17.309242316s" podCreationTimestamp="2026-04-20 19:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-20 19:28:03.307804863 +0000 UTC m=+138.998218138" watchObservedRunningTime="2026-04-20 19:28:03.309242316 +0000 UTC m=+138.999655589" Apr 20 19:28:04.292830 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:04.292797 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-96bb49785-fxs9k" Apr 20 19:28:06.042851 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.042796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:28:06.043232 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.042898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:28:06.043458 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.043435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28176d29-9406-4440-8156-fe54a5e5596e-service-ca-bundle\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:28:06.045384 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.045362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/28176d29-9406-4440-8156-fe54a5e5596e-metrics-certs\") pod \"router-default-65d6b6f7bd-fw8wn\" (UID: \"28176d29-9406-4440-8156-fe54a5e5596e\") " 
pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:28:06.309675 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.309577 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rvjwh\"" Apr 20 19:28:06.317754 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.317726 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:28:06.459100 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.459063 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65d6b6f7bd-fw8wn"] Apr 20 19:28:06.462143 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:06.462110 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28176d29_9406_4440_8156_fe54a5e5596e.slice/crio-fb68cfbd7ec53c58dc3aaae53142b96ba59a3c88772438f184f03a5024229a25 WatchSource:0}: Error finding container fb68cfbd7ec53c58dc3aaae53142b96ba59a3c88772438f184f03a5024229a25: Status 404 returned error can't find the container with id fb68cfbd7ec53c58dc3aaae53142b96ba59a3c88772438f184f03a5024229a25 Apr 20 19:28:06.747711 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.747648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:28:06.750203 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.750173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4b58e213-637f-47ba-8916-6ce23705dc6f-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-btnm6\" (UID: \"4b58e213-637f-47ba-8916-6ce23705dc6f\") " pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:28:06.818241 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.818199 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-btnm6" Apr 20 19:28:06.939471 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:06.939435 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-btnm6"] Apr 20 19:28:06.943026 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:06.942997 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b58e213_637f_47ba_8916_6ce23705dc6f.slice/crio-8329827803b6ecd45e0e8df71fc05009e411df4c70d9d73cddc67254ef4d3e53 WatchSource:0}: Error finding container 8329827803b6ecd45e0e8df71fc05009e411df4c70d9d73cddc67254ef4d3e53: Status 404 returned error can't find the container with id 8329827803b6ecd45e0e8df71fc05009e411df4c70d9d73cddc67254ef4d3e53 Apr 20 19:28:07.302190 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:07.302096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-btnm6" event={"ID":"4b58e213-637f-47ba-8916-6ce23705dc6f","Type":"ContainerStarted","Data":"d6f71aa85a4ca4db1ab3a6b3861b4b37c5a83f7924e333c83508e22aed0af7d4"} Apr 20 19:28:07.302190 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:07.302140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-btnm6" event={"ID":"4b58e213-637f-47ba-8916-6ce23705dc6f","Type":"ContainerStarted","Data":"8329827803b6ecd45e0e8df71fc05009e411df4c70d9d73cddc67254ef4d3e53"} Apr 20 19:28:07.303337 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:07.303309 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" 
event={"ID":"28176d29-9406-4440-8156-fe54a5e5596e","Type":"ContainerStarted","Data":"59fb4eadd98219ec33a5a44ca912595f870a962246b72b968310e1ff1d973692"} Apr 20 19:28:07.303337 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:07.303338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" event={"ID":"28176d29-9406-4440-8156-fe54a5e5596e","Type":"ContainerStarted","Data":"fb68cfbd7ec53c58dc3aaae53142b96ba59a3c88772438f184f03a5024229a25"} Apr 20 19:28:07.318626 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:07.318576 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:28:07.321538 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:07.321505 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:28:07.321538 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:07.321488 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" podStartSLOduration=33.321471921 podStartE2EDuration="33.321471921s" podCreationTimestamp="2026-04-20 19:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:28:07.320070586 +0000 UTC m=+143.010483862" watchObservedRunningTime="2026-04-20 19:28:07.321471921 +0000 UTC m=+143.011885192" Apr 20 19:28:08.308548 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:08.308506 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-btnm6" event={"ID":"4b58e213-637f-47ba-8916-6ce23705dc6f","Type":"ContainerStarted","Data":"951c60736b78d22326122f1de68634fdf6e236d475c9d951e5141e62d303504e"} Apr 20 19:28:08.309020 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:08.308652 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:28:08.310072 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:08.310052 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-65d6b6f7bd-fw8wn" Apr 20 19:28:09.312758 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:09.312650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-btnm6" event={"ID":"4b58e213-637f-47ba-8916-6ce23705dc6f","Type":"ContainerStarted","Data":"eea5bc36bf69cbff22f111f300cc445d773e8b9c4c44201cb02aa51880003a49"} Apr 20 19:28:09.329404 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:09.329351 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-btnm6" podStartSLOduration=17.363434317 podStartE2EDuration="19.329330153s" podCreationTimestamp="2026-04-20 19:27:50 +0000 UTC" firstStartedPulling="2026-04-20 19:28:07.01052988 +0000 UTC m=+142.700943135" lastFinishedPulling="2026-04-20 19:28:08.976425719 +0000 UTC m=+144.666838971" observedRunningTime="2026-04-20 19:28:09.328798356 +0000 UTC m=+145.019211621" watchObservedRunningTime="2026-04-20 19:28:09.329330153 +0000 UTC m=+145.019743426" Apr 20 19:28:13.968104 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:13.968064 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-96bb49785-fxs9k"] Apr 20 19:28:17.357564 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.357527 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-g4s4x"] Apr 20 19:28:17.360393 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.360373 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.362919 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.362894 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 19:28:17.364044 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.364019 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 19:28:17.364176 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.364041 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 19:28:17.364176 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.364071 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-zbpvx\"" Apr 20 19:28:17.364176 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.364019 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 19:28:17.364176 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.364026 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 19:28:17.368087 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.368064 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-g4s4x"] Apr 20 19:28:17.420329 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.420282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.420329 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.420326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.420550 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.420358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jntj7\" (UniqueName: \"kubernetes.io/projected/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-kube-api-access-jntj7\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.420550 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.420443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.521308 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.521265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.521474 ip-10-0-135-55 kubenswrapper[2575]: 
I0420 19:28:17.521340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.521474 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.521370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.521474 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.521413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jntj7\" (UniqueName: \"kubernetes.io/projected/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-kube-api-access-jntj7\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.522098 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.522036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.524040 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.524016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.524565 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.524547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.531292 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.531262 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jntj7\" (UniqueName: \"kubernetes.io/projected/b2f72a49-7b0b-4e66-806b-91f6b151dcc1-kube-api-access-jntj7\") pod \"prometheus-operator-5676c8c784-g4s4x\" (UID: \"b2f72a49-7b0b-4e66-806b-91f6b151dcc1\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.670334 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.670301 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" Apr 20 19:28:17.794469 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:17.794432 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-g4s4x"] Apr 20 19:28:17.798342 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:17.798307 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2f72a49_7b0b_4e66_806b_91f6b151dcc1.slice/crio-f412ca0331c65b6ed527514a9e8a46fac77818890d8a2cbe8b71b57048ce6dbe WatchSource:0}: Error finding container f412ca0331c65b6ed527514a9e8a46fac77818890d8a2cbe8b71b57048ce6dbe: Status 404 returned error can't find the container with id f412ca0331c65b6ed527514a9e8a46fac77818890d8a2cbe8b71b57048ce6dbe Apr 20 19:28:18.336157 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:18.336113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" event={"ID":"b2f72a49-7b0b-4e66-806b-91f6b151dcc1","Type":"ContainerStarted","Data":"f412ca0331c65b6ed527514a9e8a46fac77818890d8a2cbe8b71b57048ce6dbe"} Apr 20 19:28:19.340539 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:19.340491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" event={"ID":"b2f72a49-7b0b-4e66-806b-91f6b151dcc1","Type":"ContainerStarted","Data":"9a36ab6090fa6312a0318907e8a3d6e385ddfb13455d6af38c32ca338cdb7bb0"} Apr 20 19:28:19.340539 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:19.340536 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" event={"ID":"b2f72a49-7b0b-4e66-806b-91f6b151dcc1","Type":"ContainerStarted","Data":"4374fa9cbb79f18eb9ba60f67f525fa4ed9e34cc660c585be7ffcddacee7978f"} Apr 20 19:28:19.357947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:19.357887 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-g4s4x" podStartSLOduration=1.326108672 podStartE2EDuration="2.357866278s" podCreationTimestamp="2026-04-20 19:28:17 +0000 UTC" firstStartedPulling="2026-04-20 19:28:17.800282052 +0000 UTC m=+153.490695304" lastFinishedPulling="2026-04-20 19:28:18.832039658 +0000 UTC m=+154.522452910" observedRunningTime="2026-04-20 19:28:19.356849237 +0000 UTC m=+155.047262510" watchObservedRunningTime="2026-04-20 19:28:19.357866278 +0000 UTC m=+155.048279552" Apr 20 19:28:21.242871 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:21.242826 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-d6zt6" podUID="5b292e05-658a-4efc-8b35-8f64c0071f73" Apr 20 19:28:21.253121 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:21.253072 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hsc5z" podUID="e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425" Apr 20 19:28:21.345629 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.345593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6zt6" Apr 20 19:28:21.711186 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.711151 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2"] Apr 20 19:28:21.713495 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.713469 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" Apr 20 19:28:21.715668 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.715639 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 19:28:21.715809 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.715795 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 19:28:21.715958 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.715941 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-rtrrm\"" Apr 20 19:28:21.723525 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.723502 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2"] Apr 20 19:28:21.743924 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.743886 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-677ws"] Apr 20 19:28:21.746058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.746035 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.748641 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.748594 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-t69jz\"" Apr 20 19:28:21.748641 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.748591 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 19:28:21.748855 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.748655 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 19:28:21.748855 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.748742 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 19:28:21.758850 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.758813 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-textfile\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.758850 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.758850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-tls\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.759058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.758873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02f4c19d-e14b-45e5-8c19-5ad304dc953b-metrics-client-ca\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.759058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.758931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.759058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.758972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02f4c19d-e14b-45e5-8c19-5ad304dc953b-sys\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.759058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.758998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs42w\" (UniqueName: \"kubernetes.io/projected/02f4c19d-e14b-45e5-8c19-5ad304dc953b-kube-api-access-qs42w\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.759058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.759031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ed2128b-fc5f-4c17-b148-fae03c419a6d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" Apr 20 19:28:21.759058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.759047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-wtmp\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.759322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.759066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-accelerators-collector-config\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.759322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.759189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ed2128b-fc5f-4c17-b148-fae03c419a6d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" Apr 20 19:28:21.759322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.759247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84mjd\" (UniqueName: \"kubernetes.io/projected/4ed2128b-fc5f-4c17-b148-fae03c419a6d-kube-api-access-84mjd\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" Apr 20 19:28:21.759322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.759297 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ed2128b-fc5f-4c17-b148-fae03c419a6d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" Apr 20 19:28:21.759479 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.759328 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/02f4c19d-e14b-45e5-8c19-5ad304dc953b-root\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.860794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.860747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02f4c19d-e14b-45e5-8c19-5ad304dc953b-sys\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.860794 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.860799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs42w\" (UniqueName: \"kubernetes.io/projected/02f4c19d-e14b-45e5-8c19-5ad304dc953b-kube-api-access-qs42w\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.861058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.860838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ed2128b-fc5f-4c17-b148-fae03c419a6d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: 
\"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" Apr 20 19:28:21.861058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.860857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/02f4c19d-e14b-45e5-8c19-5ad304dc953b-sys\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.861058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.860864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-wtmp\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.861058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.860915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-accelerators-collector-config\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.861058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.860983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ed2128b-fc5f-4c17-b148-fae03c419a6d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" Apr 20 19:28:21.861058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.860989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-wtmp\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.861058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84mjd\" (UniqueName: \"kubernetes.io/projected/4ed2128b-fc5f-4c17-b148-fae03c419a6d-kube-api-access-84mjd\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" Apr 20 19:28:21.861369 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ed2128b-fc5f-4c17-b148-fae03c419a6d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" Apr 20 19:28:21.861369 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/02f4c19d-e14b-45e5-8c19-5ad304dc953b-root\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.861369 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-textfile\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws" Apr 20 19:28:21.861369 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:28:21.861156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-tls\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:21.861369 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02f4c19d-e14b-45e5-8c19-5ad304dc953b-metrics-client-ca\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:21.861707 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:21.861235 2575 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 19:28:21.861836 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:21.861765 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ed2128b-fc5f-4c17-b148-fae03c419a6d-openshift-state-metrics-tls podName:4ed2128b-fc5f-4c17-b148-fae03c419a6d nodeName:}" failed. No retries permitted until 2026-04-20 19:28:22.361743314 +0000 UTC m=+158.052156566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/4ed2128b-fc5f-4c17-b148-fae03c419a6d-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-pfkt2" (UID: "4ed2128b-fc5f-4c17-b148-fae03c419a6d") : secret "openshift-state-metrics-tls" not found
Apr 20 19:28:21.861836 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/02f4c19d-e14b-45e5-8c19-5ad304dc953b-root\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:21.861836 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-textfile\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:21.861836 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-accelerators-collector-config\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:21.861836 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:21.862368 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:21.861355 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 19:28:21.862368 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:21.861908 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-tls podName:02f4c19d-e14b-45e5-8c19-5ad304dc953b nodeName:}" failed. No retries permitted until 2026-04-20 19:28:22.361894369 +0000 UTC m=+158.052307625 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-tls") pod "node-exporter-677ws" (UID: "02f4c19d-e14b-45e5-8c19-5ad304dc953b") : secret "node-exporter-tls" not found
Apr 20 19:28:21.862368 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.861916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4ed2128b-fc5f-4c17-b148-fae03c419a6d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2"
Apr 20 19:28:21.862368 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.862074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02f4c19d-e14b-45e5-8c19-5ad304dc953b-metrics-client-ca\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:21.863551 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.863524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4ed2128b-fc5f-4c17-b148-fae03c419a6d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2"
Apr 20 19:28:21.863843 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.863825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:21.869332 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.869307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84mjd\" (UniqueName: \"kubernetes.io/projected/4ed2128b-fc5f-4c17-b148-fae03c419a6d-kube-api-access-84mjd\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2"
Apr 20 19:28:21.869431 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:21.869338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs42w\" (UniqueName: \"kubernetes.io/projected/02f4c19d-e14b-45e5-8c19-5ad304dc953b-kube-api-access-qs42w\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:22.365792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.365749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ed2128b-fc5f-4c17-b148-fae03c419a6d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2"
Apr 20 19:28:22.365792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.365794 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-tls\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:22.368432 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.368397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/02f4c19d-e14b-45e5-8c19-5ad304dc953b-node-exporter-tls\") pod \"node-exporter-677ws\" (UID: \"02f4c19d-e14b-45e5-8c19-5ad304dc953b\") " pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:22.368547 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.368522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4ed2128b-fc5f-4c17-b148-fae03c419a6d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pfkt2\" (UID: \"4ed2128b-fc5f-4c17-b148-fae03c419a6d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2"
Apr 20 19:28:22.623285 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.623189 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2"
Apr 20 19:28:22.655242 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.655205 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-677ws"
Apr 20 19:28:22.666502 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:22.666457 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f4c19d_e14b_45e5_8c19_5ad304dc953b.slice/crio-a0f32b32cb254236a020884c733e313f211857c61955ddc1b97b9b2323095647 WatchSource:0}: Error finding container a0f32b32cb254236a020884c733e313f211857c61955ddc1b97b9b2323095647: Status 404 returned error can't find the container with id a0f32b32cb254236a020884c733e313f211857c61955ddc1b97b9b2323095647
Apr 20 19:28:22.759636 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.759577 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2"]
Apr 20 19:28:22.762909 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:22.762872 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed2128b_fc5f_4c17_b148_fae03c419a6d.slice/crio-6f90a5bb204c4606b4a6c8de66c0a866005084a0f2c42dc9a846b6f3302002b3 WatchSource:0}: Error finding container 6f90a5bb204c4606b4a6c8de66c0a866005084a0f2c42dc9a846b6f3302002b3: Status 404 returned error can't find the container with id 6f90a5bb204c4606b4a6c8de66c0a866005084a0f2c42dc9a846b6f3302002b3
Apr 20 19:28:22.808661 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.808633 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 19:28:22.813698 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.813664 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.816491 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816342 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 19:28:22.816491 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 19:28:22.816491 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816461 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 19:28:22.816843 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816514 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 19:28:22.816843 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816687 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 19:28:22.816843 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816697 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-q9zsw\""
Apr 20 19:28:22.816843 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816722 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 19:28:22.816843 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816735 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 19:28:22.816843 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816799 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 19:28:22.816843 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.816818 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 19:28:22.824643 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.824575 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.879517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-config-out\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.879598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.879675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.879704 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.879729 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbvsv\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-kube-api-access-lbvsv\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.879787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.879878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.879958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-web-config\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.879994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.880020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.880045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.880090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.881202 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.880120 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-config-volume\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.918977 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:22.918942 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-j9xp6" podUID="234402da-caaa-48f3-8a69-400f12f55eb6"
Apr 20 19:28:22.980963 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.980867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.980963 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.980913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.980963 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.980934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981243 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.980987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbvsv\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-kube-api-access-lbvsv\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981243 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.981031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981243 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:22.981053 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-trusted-ca-bundle podName:5d3a8135-3b09-420c-92dc-fbe987865282 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:23.481031067 +0000 UTC m=+159.171444319 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282") : configmap references non-existent config key: ca-bundle.crt
Apr 20 19:28:22.981243 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:22.981052 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 20 19:28:22.981243 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:22.981120 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-main-tls podName:5d3a8135-3b09-420c-92dc-fbe987865282 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:23.481111327 +0000 UTC m=+159.171524579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282") : secret "alertmanager-main-tls" not found
Apr 20 19:28:22.981243 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.981139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981243 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.981193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-web-config\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981243 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.981222 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981243 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.981247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981706 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.981274 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981706 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.981315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981706 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.981342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-config-volume\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.981706 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.981377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-config-out\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.982040 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.982018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.982298 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.982270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.984535 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.984501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.984706 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.984642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.984706 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.984689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-config-out\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.984808 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.984791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-config-volume\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.984988 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.984967 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-web-config\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.985041 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.984975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.985496 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.985480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.985892 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.985873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:22.989339 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:22.989307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbvsv\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-kube-api-access-lbvsv\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:23.353492 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.353457 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" event={"ID":"4ed2128b-fc5f-4c17-b148-fae03c419a6d","Type":"ContainerStarted","Data":"921a0c02bcd11c69f4a6ed28caa1837d7cc9621511f0bb740b2881998dbec443"}
Apr 20 19:28:23.353492 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.353499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" event={"ID":"4ed2128b-fc5f-4c17-b148-fae03c419a6d","Type":"ContainerStarted","Data":"5affa42a7423c7bde4c2e6d81bc4145a0c760fae54376a2887c28ed737395960"}
Apr 20 19:28:23.353771 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.353511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" event={"ID":"4ed2128b-fc5f-4c17-b148-fae03c419a6d","Type":"ContainerStarted","Data":"6f90a5bb204c4606b4a6c8de66c0a866005084a0f2c42dc9a846b6f3302002b3"}
Apr 20 19:28:23.354545 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.354516 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-677ws" event={"ID":"02f4c19d-e14b-45e5-8c19-5ad304dc953b","Type":"ContainerStarted","Data":"a0f32b32cb254236a020884c733e313f211857c61955ddc1b97b9b2323095647"}
Apr 20 19:28:23.486157 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.486087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:23.486157 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.486145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:23.487129 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.487099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:23.489049 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.489019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:23.724769 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.724726 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:28:23.975299 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.975181 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:28:23.992981 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:23.992948 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 19:28:23.996223 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:23.996184 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3a8135_3b09_420c_92dc_fbe987865282.slice/crio-2bd3c408e7c7a1415b6fe007ba54efcfa9aa51cbf3fa952bc407f85bb541a17f WatchSource:0}: Error finding container 2bd3c408e7c7a1415b6fe007ba54efcfa9aa51cbf3fa952bc407f85bb541a17f: Status 404 returned error can't find the container with id 2bd3c408e7c7a1415b6fe007ba54efcfa9aa51cbf3fa952bc407f85bb541a17f
Apr 20 19:28:24.359207 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:24.359110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" event={"ID":"4ed2128b-fc5f-4c17-b148-fae03c419a6d","Type":"ContainerStarted","Data":"4105e3fc50831eff4a7f3b54f9dc3f7f75cd9a1bcbf13b0ab00b57806d3afa80"}
Apr 20 19:28:24.360625 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:24.360580 2575 generic.go:358] "Generic (PLEG): container finished" podID="02f4c19d-e14b-45e5-8c19-5ad304dc953b" containerID="d905a1b9d19d75a022b2e6d77ed347204dac4aa8e6837b68ca0f0c15662a1e72" exitCode=0
Apr 20 19:28:24.360777 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:24.360650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-677ws" event={"ID":"02f4c19d-e14b-45e5-8c19-5ad304dc953b","Type":"ContainerDied","Data":"d905a1b9d19d75a022b2e6d77ed347204dac4aa8e6837b68ca0f0c15662a1e72"}
Apr 20 19:28:24.361737 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:24.361710 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerStarted","Data":"2bd3c408e7c7a1415b6fe007ba54efcfa9aa51cbf3fa952bc407f85bb541a17f"}
Apr 20 19:28:24.379104 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:24.379036 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pfkt2" podStartSLOduration=2.426566366 podStartE2EDuration="3.379016524s" podCreationTimestamp="2026-04-20 19:28:21 +0000 UTC" firstStartedPulling="2026-04-20 19:28:22.917327849 +0000 UTC m=+158.607741101" lastFinishedPulling="2026-04-20 19:28:23.869777992 +0000 UTC m=+159.560191259" observedRunningTime="2026-04-20 19:28:24.377296614 +0000 UTC m=+160.067709889" watchObservedRunningTime="2026-04-20 19:28:24.379016524 +0000 UTC m=+160.069429801"
Apr 20 19:28:25.367585 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:25.367492 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-677ws" event={"ID":"02f4c19d-e14b-45e5-8c19-5ad304dc953b","Type":"ContainerStarted","Data":"98c6caf05168642c4377634616e1b2fb9d45c265e48f6d15b1e136501bba4d82"}
Apr 20 19:28:25.367585 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:25.367540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-677ws" event={"ID":"02f4c19d-e14b-45e5-8c19-5ad304dc953b","Type":"ContainerStarted","Data":"867295e9c8f31040774754c8dc905069d49b243bb022bc282a007e77f2ddaf06"}
Apr 20 19:28:25.369018 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:25.368994 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d3a8135-3b09-420c-92dc-fbe987865282" containerID="a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04" exitCode=0
Apr 20 19:28:25.369146 ip-10-0-135-55 kubenswrapper[2575]: I0420
19:28:25.369071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerDied","Data":"a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04"} Apr 20 19:28:25.386199 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:25.386141 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-677ws" podStartSLOduration=3.711914496 podStartE2EDuration="4.386126302s" podCreationTimestamp="2026-04-20 19:28:21 +0000 UTC" firstStartedPulling="2026-04-20 19:28:22.668149869 +0000 UTC m=+158.358563120" lastFinishedPulling="2026-04-20 19:28:23.342361661 +0000 UTC m=+159.032774926" observedRunningTime="2026-04-20 19:28:25.385340127 +0000 UTC m=+161.075753401" watchObservedRunningTime="2026-04-20 19:28:25.386126302 +0000 UTC m=+161.076539575" Apr 20 19:28:26.112446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.112397 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: \"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6" Apr 20 19:28:26.112698 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.112500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z" Apr 20 19:28:26.115295 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.115252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b292e05-658a-4efc-8b35-8f64c0071f73-metrics-tls\") pod \"dns-default-d6zt6\" (UID: 
\"5b292e05-658a-4efc-8b35-8f64c0071f73\") " pod="openshift-dns/dns-default-d6zt6" Apr 20 19:28:26.115457 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.115389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425-cert\") pod \"ingress-canary-hsc5z\" (UID: \"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425\") " pod="openshift-ingress-canary/ingress-canary-hsc5z" Apr 20 19:28:26.149000 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.148963 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5vnpt\"" Apr 20 19:28:26.157409 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.157363 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6zt6" Apr 20 19:28:26.237243 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.237209 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-9dc95554d-27nnz"] Apr 20 19:28:26.240243 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.240215 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.243081 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.243052 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-rljn6\"" Apr 20 19:28:26.243244 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.243143 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 19:28:26.244096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.243916 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 19:28:26.244096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.244003 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 19:28:26.244096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.244026 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 19:28:26.244096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.244061 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-23vi473637dg7\"" Apr 20 19:28:26.250003 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.249971 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-9dc95554d-27nnz"] Apr 20 19:28:26.308094 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.308060 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6zt6"] Apr 20 19:28:26.311832 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:26.311784 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b292e05_658a_4efc_8b35_8f64c0071f73.slice/crio-6d97cae4390c710e37ce119e67e1d92330365a8900851b4aba08692e5b223380 WatchSource:0}: Error finding container 6d97cae4390c710e37ce119e67e1d92330365a8900851b4aba08692e5b223380: Status 404 returned error can't find the container with id 6d97cae4390c710e37ce119e67e1d92330365a8900851b4aba08692e5b223380 Apr 20 19:28:26.314467 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.314437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/85576550-47ec-4bdb-8e87-c3dc63086f67-audit-log\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.314566 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.314480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/85576550-47ec-4bdb-8e87-c3dc63086f67-metrics-server-audit-profiles\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.314566 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.314502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flw2l\" (UniqueName: \"kubernetes.io/projected/85576550-47ec-4bdb-8e87-c3dc63086f67-kube-api-access-flw2l\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.314566 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.314560 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/85576550-47ec-4bdb-8e87-c3dc63086f67-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.314725 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.314652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85576550-47ec-4bdb-8e87-c3dc63086f67-client-ca-bundle\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.314725 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.314690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/85576550-47ec-4bdb-8e87-c3dc63086f67-secret-metrics-server-client-certs\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.314822 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.314729 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/85576550-47ec-4bdb-8e87-c3dc63086f67-secret-metrics-server-tls\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.373749 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.373645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6zt6" event={"ID":"5b292e05-658a-4efc-8b35-8f64c0071f73","Type":"ContainerStarted","Data":"6d97cae4390c710e37ce119e67e1d92330365a8900851b4aba08692e5b223380"} Apr 20 19:28:26.415394 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.415340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/85576550-47ec-4bdb-8e87-c3dc63086f67-secret-metrics-server-tls\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.415701 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.415679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/85576550-47ec-4bdb-8e87-c3dc63086f67-audit-log\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.415834 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.415817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/85576550-47ec-4bdb-8e87-c3dc63086f67-metrics-server-audit-profiles\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.415904 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.415858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flw2l\" (UniqueName: \"kubernetes.io/projected/85576550-47ec-4bdb-8e87-c3dc63086f67-kube-api-access-flw2l\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.415972 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.415955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/85576550-47ec-4bdb-8e87-c3dc63086f67-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.416021 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.416004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85576550-47ec-4bdb-8e87-c3dc63086f67-client-ca-bundle\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.416073 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.416038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/85576550-47ec-4bdb-8e87-c3dc63086f67-secret-metrics-server-client-certs\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.416124 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.416096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/85576550-47ec-4bdb-8e87-c3dc63086f67-audit-log\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.416578 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.416546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85576550-47ec-4bdb-8e87-c3dc63086f67-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 
19:28:26.417062 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.417013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/85576550-47ec-4bdb-8e87-c3dc63086f67-metrics-server-audit-profiles\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.418782 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.418753 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/85576550-47ec-4bdb-8e87-c3dc63086f67-secret-metrics-server-tls\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.419020 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.419002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/85576550-47ec-4bdb-8e87-c3dc63086f67-secret-metrics-server-client-certs\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.419077 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.419040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85576550-47ec-4bdb-8e87-c3dc63086f67-client-ca-bundle\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.424468 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.424430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flw2l\" (UniqueName: 
\"kubernetes.io/projected/85576550-47ec-4bdb-8e87-c3dc63086f67-kube-api-access-flw2l\") pod \"metrics-server-9dc95554d-27nnz\" (UID: \"85576550-47ec-4bdb-8e87-c3dc63086f67\") " pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.493322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.493281 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt"] Apr 20 19:28:26.499278 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.499246 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" Apr 20 19:28:26.502015 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.501981 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 19:28:26.502015 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.502004 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-646kb\"" Apr 20 19:28:26.512268 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.512230 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt"] Apr 20 19:28:26.557850 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.557798 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" Apr 20 19:28:26.617789 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.617744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd1e918e-d7fe-4a79-99b4-8b594bac54b0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tgvkt\" (UID: \"dd1e918e-d7fe-4a79-99b4-8b594bac54b0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" Apr 20 19:28:26.718802 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.718768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd1e918e-d7fe-4a79-99b4-8b594bac54b0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tgvkt\" (UID: \"dd1e918e-d7fe-4a79-99b4-8b594bac54b0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" Apr 20 19:28:26.719000 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:26.718973 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 20 19:28:26.719093 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:26.719081 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd1e918e-d7fe-4a79-99b4-8b594bac54b0-monitoring-plugin-cert podName:dd1e918e-d7fe-4a79-99b4-8b594bac54b0 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:27.219058387 +0000 UTC m=+162.909471657 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/dd1e918e-d7fe-4a79-99b4-8b594bac54b0-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-tgvkt" (UID: "dd1e918e-d7fe-4a79-99b4-8b594bac54b0") : secret "monitoring-plugin-cert" not found Apr 20 19:28:26.832050 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:26.832015 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-9dc95554d-27nnz"] Apr 20 19:28:26.834910 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:26.834802 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85576550_47ec_4bdb_8e87_c3dc63086f67.slice/crio-ca35b77d2349d3a82f12d59804323eb38c33fc724e06e17be7a1ffea7717622a WatchSource:0}: Error finding container ca35b77d2349d3a82f12d59804323eb38c33fc724e06e17be7a1ffea7717622a: Status 404 returned error can't find the container with id ca35b77d2349d3a82f12d59804323eb38c33fc724e06e17be7a1ffea7717622a Apr 20 19:28:27.222560 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:27.222463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd1e918e-d7fe-4a79-99b4-8b594bac54b0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tgvkt\" (UID: \"dd1e918e-d7fe-4a79-99b4-8b594bac54b0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" Apr 20 19:28:27.225465 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:27.225429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/dd1e918e-d7fe-4a79-99b4-8b594bac54b0-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-tgvkt\" (UID: \"dd1e918e-d7fe-4a79-99b4-8b594bac54b0\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" Apr 20 19:28:27.380192 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:28:27.380150 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerStarted","Data":"71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78"} Apr 20 19:28:27.380192 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:27.380190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerStarted","Data":"f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba"} Apr 20 19:28:27.380757 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:27.380202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerStarted","Data":"268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3"} Apr 20 19:28:27.380757 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:27.380214 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerStarted","Data":"a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108"} Apr 20 19:28:27.380757 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:27.380224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerStarted","Data":"b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3"} Apr 20 19:28:27.381388 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:27.381342 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" event={"ID":"85576550-47ec-4bdb-8e87-c3dc63086f67","Type":"ContainerStarted","Data":"ca35b77d2349d3a82f12d59804323eb38c33fc724e06e17be7a1ffea7717622a"} Apr 20 19:28:27.419307 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:28:27.419264 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" Apr 20 19:28:28.388283 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:28.387441 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" event={"ID":"85576550-47ec-4bdb-8e87-c3dc63086f67","Type":"ContainerStarted","Data":"06dc87311e0657b2fc247fc539e0a89cd39da016f3ea42cb1e1cc2e45e53f0c1"} Apr 20 19:28:28.410643 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:28.410027 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-9dc95554d-27nnz" podStartSLOduration=0.977481019 podStartE2EDuration="2.41000354s" podCreationTimestamp="2026-04-20 19:28:26 +0000 UTC" firstStartedPulling="2026-04-20 19:28:26.838048429 +0000 UTC m=+162.528461682" lastFinishedPulling="2026-04-20 19:28:28.270570936 +0000 UTC m=+163.960984203" observedRunningTime="2026-04-20 19:28:28.40988068 +0000 UTC m=+164.100293955" watchObservedRunningTime="2026-04-20 19:28:28.41000354 +0000 UTC m=+164.100416818" Apr 20 19:28:28.439419 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:28.439373 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt"] Apr 20 19:28:28.446947 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:28.446149 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd1e918e_d7fe_4a79_99b4_8b594bac54b0.slice/crio-e6de45e5edf67867379de3b1647ee3a4a8b0f458e359d07e8279395d6b3f0497 WatchSource:0}: Error finding container e6de45e5edf67867379de3b1647ee3a4a8b0f458e359d07e8279395d6b3f0497: Status 404 returned error can't find the container with id e6de45e5edf67867379de3b1647ee3a4a8b0f458e359d07e8279395d6b3f0497 Apr 20 19:28:29.393844 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:28:29.393792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerStarted","Data":"beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc"} Apr 20 19:28:29.396011 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:29.395970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" event={"ID":"dd1e918e-d7fe-4a79-99b4-8b594bac54b0","Type":"ContainerStarted","Data":"e6de45e5edf67867379de3b1647ee3a4a8b0f458e359d07e8279395d6b3f0497"} Apr 20 19:28:29.397837 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:29.397804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6zt6" event={"ID":"5b292e05-658a-4efc-8b35-8f64c0071f73","Type":"ContainerStarted","Data":"ffffe2efe9db46014d5f1f0eb4dbe3dedc559cb6b2c21827861decb6d91a0dbc"} Apr 20 19:28:29.397973 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:29.397843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6zt6" event={"ID":"5b292e05-658a-4efc-8b35-8f64c0071f73","Type":"ContainerStarted","Data":"afb242464fc0e0e0c9174551ef8644a33a825757b9b677de74d6fa36edd8089f"} Apr 20 19:28:29.449249 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:29.449190 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-d6zt6" podStartSLOduration=129.495571133 podStartE2EDuration="2m11.449170932s" podCreationTimestamp="2026-04-20 19:26:18 +0000 UTC" firstStartedPulling="2026-04-20 19:28:26.314310296 +0000 UTC m=+162.004723556" lastFinishedPulling="2026-04-20 19:28:28.267910098 +0000 UTC m=+163.958323355" observedRunningTime="2026-04-20 19:28:29.44886971 +0000 UTC m=+165.139282998" watchObservedRunningTime="2026-04-20 19:28:29.449170932 +0000 UTC m=+165.139584209" Apr 20 19:28:29.450851 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:29.450796 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.655423774 podStartE2EDuration="7.450781057s" podCreationTimestamp="2026-04-20 19:28:22 +0000 UTC" firstStartedPulling="2026-04-20 19:28:23.998085723 +0000 UTC m=+159.688498975" lastFinishedPulling="2026-04-20 19:28:28.793443001 +0000 UTC m=+164.483856258" observedRunningTime="2026-04-20 19:28:29.424129304 +0000 UTC m=+165.114542579" watchObservedRunningTime="2026-04-20 19:28:29.450781057 +0000 UTC m=+165.141194332" Apr 20 19:28:30.402689 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.402652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" event={"ID":"dd1e918e-d7fe-4a79-99b4-8b594bac54b0","Type":"ContainerStarted","Data":"ddb7d505ce83b03c3183ccf53210c666273bae15ae9fb6e6709ddb9d0eebb073"} Apr 20 19:28:30.403424 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.403399 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" Apr 20 19:28:30.403546 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.403434 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-d6zt6" Apr 20 19:28:30.408213 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.408187 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" Apr 20 19:28:30.419430 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.419379 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-tgvkt" podStartSLOduration=3.2236959880000002 podStartE2EDuration="4.419363462s" podCreationTimestamp="2026-04-20 19:28:26 +0000 UTC" firstStartedPulling="2026-04-20 19:28:28.448221881 +0000 UTC m=+164.138635133" lastFinishedPulling="2026-04-20 
19:28:29.643889354 +0000 UTC m=+165.334302607" observedRunningTime="2026-04-20 19:28:30.417580485 +0000 UTC m=+166.107993760" watchObservedRunningTime="2026-04-20 19:28:30.419363462 +0000 UTC m=+166.109776736" Apr 20 19:28:30.561846 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.561811 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78d788cf7-z9lk4"] Apr 20 19:28:30.565243 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.565219 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.567869 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.567838 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 19:28:30.567869 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.567857 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 19:28:30.568093 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.567897 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 19:28:30.568093 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.567853 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 19:28:30.568093 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.567856 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wvcd6\"" Apr 20 19:28:30.568769 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.568752 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 19:28:30.568878 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.568857 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 19:28:30.569377 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.569361 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 19:28:30.572766 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.572738 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 19:28:30.576969 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.576946 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78d788cf7-z9lk4"] Apr 20 19:28:30.660865 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.660772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-console-config\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.660865 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.660817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-trusted-ca-bundle\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.660865 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.660857 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-oauth-serving-cert\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.661088 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:28:30.660908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-oauth-config\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.661088 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.660946 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-serving-cert\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.661088 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.660981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-service-ca\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.661088 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.660999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlpbp\" (UniqueName: \"kubernetes.io/projected/09706621-fcf9-4aa5-bd71-c676016b81d3-kube-api-access-nlpbp\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.761717 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.761675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-oauth-serving-cert\") pod \"console-78d788cf7-z9lk4\" 
(UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.761930 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.761725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-oauth-config\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.761930 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.761761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-serving-cert\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.761930 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.761803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-service-ca\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.761930 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.761825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlpbp\" (UniqueName: \"kubernetes.io/projected/09706621-fcf9-4aa5-bd71-c676016b81d3-kube-api-access-nlpbp\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.761930 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.761906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-console-config\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.762196 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.761943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-trusted-ca-bundle\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.762436 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.762408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-oauth-serving-cert\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.762520 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.762489 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-service-ca\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.762681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.762662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-console-config\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.762752 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.762714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-trusted-ca-bundle\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.765030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.765010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-serving-cert\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.765030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.765024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-oauth-config\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.770346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.770314 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlpbp\" (UniqueName: \"kubernetes.io/projected/09706621-fcf9-4aa5-bd71-c676016b81d3-kube-api-access-nlpbp\") pod \"console-78d788cf7-z9lk4\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") " pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:30.876516 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:30.876472 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78d788cf7-z9lk4" Apr 20 19:28:31.008908 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:31.008879 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78d788cf7-z9lk4"] Apr 20 19:28:31.011258 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:31.011222 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09706621_fcf9_4aa5_bd71_c676016b81d3.slice/crio-76d9c13f127afeb45199c312c5ab2be205efaced2acb6bf7803d3e51a24215af WatchSource:0}: Error finding container 76d9c13f127afeb45199c312c5ab2be205efaced2acb6bf7803d3e51a24215af: Status 404 returned error can't find the container with id 76d9c13f127afeb45199c312c5ab2be205efaced2acb6bf7803d3e51a24215af Apr 20 19:28:31.407470 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:31.407425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d788cf7-z9lk4" event={"ID":"09706621-fcf9-4aa5-bd71-c676016b81d3","Type":"ContainerStarted","Data":"76d9c13f127afeb45199c312c5ab2be205efaced2acb6bf7803d3e51a24215af"} Apr 20 19:28:33.903408 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:33.903362 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6" Apr 20 19:28:34.418554 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:34.418514 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d788cf7-z9lk4" event={"ID":"09706621-fcf9-4aa5-bd71-c676016b81d3","Type":"ContainerStarted","Data":"5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8"} Apr 20 19:28:34.438928 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:34.438872 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78d788cf7-z9lk4" podStartSLOduration=1.7334130079999999 podStartE2EDuration="4.438852467s" podCreationTimestamp="2026-04-20 19:28:30 +0000 UTC" firstStartedPulling="2026-04-20 19:28:31.013555611 +0000 UTC m=+166.703968863" lastFinishedPulling="2026-04-20 19:28:33.718995056 +0000 UTC m=+169.409408322" observedRunningTime="2026-04-20 19:28:34.438236605 +0000 UTC m=+170.128649878" watchObservedRunningTime="2026-04-20 19:28:34.438852467 +0000 UTC m=+170.129265746" Apr 20 19:28:34.904472 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:34.904436 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hsc5z" Apr 20 19:28:34.907007 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:34.906979 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbf4z\"" Apr 20 19:28:34.915016 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:34.914989 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hsc5z" Apr 20 19:28:35.049079 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:35.048988 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hsc5z"] Apr 20 19:28:35.051870 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:35.051825 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode89b8bbe_eeb8_48dd_8e7e_9cb9544b9425.slice/crio-bf99c5f75687144af6cfc3638f3f2acca7812d2ee66a7a4764550ea1cd7678f4 WatchSource:0}: Error finding container bf99c5f75687144af6cfc3638f3f2acca7812d2ee66a7a4764550ea1cd7678f4: Status 404 returned error can't find the container with id bf99c5f75687144af6cfc3638f3f2acca7812d2ee66a7a4764550ea1cd7678f4 Apr 20 19:28:35.423116 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:35.423068 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hsc5z" event={"ID":"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425","Type":"ContainerStarted","Data":"bf99c5f75687144af6cfc3638f3f2acca7812d2ee66a7a4764550ea1cd7678f4"} Apr 20 19:28:37.432998 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:37.432960 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hsc5z" event={"ID":"e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425","Type":"ContainerStarted","Data":"9bee2984aa0af825329e9362adfc4af0d5db41177e6f6d16a484c9502431ec41"} Apr 20 19:28:37.450025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:37.449950 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hsc5z" podStartSLOduration=137.801743969 podStartE2EDuration="2m19.449929754s" podCreationTimestamp="2026-04-20 19:26:18 +0000 UTC" firstStartedPulling="2026-04-20 19:28:35.053894325 +0000 UTC m=+170.744307586" lastFinishedPulling="2026-04-20 19:28:36.702080106 +0000 UTC m=+172.392493371" 
observedRunningTime="2026-04-20 19:28:37.449288728 +0000 UTC m=+173.139702003" watchObservedRunningTime="2026-04-20 19:28:37.449929754 +0000 UTC m=+173.140343013" Apr 20 19:28:38.278765 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.278728 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dc875cc46-p2shg"] Apr 20 19:28:38.280974 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.280955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.291937 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.291906 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dc875cc46-p2shg"] Apr 20 19:28:38.436250 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.436217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-config\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.436763 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.436277 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2whg\" (UniqueName: \"kubernetes.io/projected/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-kube-api-access-d2whg\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.436763 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.436321 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-oauth-serving-cert\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " 
pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.436763 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.436379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-trusted-ca-bundle\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.436763 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.436441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-service-ca\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.436763 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.436485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-oauth-config\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.436763 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.436542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-serving-cert\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.538093 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.537994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2whg\" (UniqueName: 
\"kubernetes.io/projected/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-kube-api-access-d2whg\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.538093 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.538060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-oauth-serving-cert\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.538324 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.538108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-trusted-ca-bundle\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.538324 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.538133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-service-ca\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.538324 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.538163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-oauth-config\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.538324 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.538216 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-serving-cert\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.538324 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.538277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-config\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.538918 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.538886 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-oauth-serving-cert\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.539034 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.538918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-config\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.539034 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.538920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-service-ca\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.539792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.539768 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-trusted-ca-bundle\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.540900 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.540878 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-oauth-config\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.541113 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.541093 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-serving-cert\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.546312 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.546286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2whg\" (UniqueName: \"kubernetes.io/projected/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-kube-api-access-d2whg\") pod \"console-5dc875cc46-p2shg\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") " pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.590984 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.590948 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5dc875cc46-p2shg" Apr 20 19:28:38.724713 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.724508 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dc875cc46-p2shg"] Apr 20 19:28:38.727673 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:28:38.727644 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69ef48d3_2a44_4de6_9e9e_f2f19c7d831c.slice/crio-76302d153f2670e4590201280f55b2411b41e6255ddb07f38b722b1e9e411b67 WatchSource:0}: Error finding container 76302d153f2670e4590201280f55b2411b41e6255ddb07f38b722b1e9e411b67: Status 404 returned error can't find the container with id 76302d153f2670e4590201280f55b2411b41e6255ddb07f38b722b1e9e411b67 Apr 20 19:28:38.989563 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:38.989476 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-96bb49785-fxs9k" podUID="aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" containerName="registry" containerID="cri-o://f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83" gracePeriod=30 Apr 20 19:28:39.229199 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.229174 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-96bb49785-fxs9k" Apr 20 19:28:39.345357 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.345319 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") pod \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " Apr 20 19:28:39.345513 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.345378 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-ca-trust-extracted\") pod \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " Apr 20 19:28:39.345513 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.345424 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-certificates\") pod \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " Apr 20 19:28:39.345513 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.345448 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx94k\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-kube-api-access-nx94k\") pod \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " Apr 20 19:28:39.345513 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.345475 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-bound-sa-token\") pod \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " Apr 20 
19:28:39.345731 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.345540 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-trusted-ca\") pod \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " Apr 20 19:28:39.345731 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.345575 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-image-registry-private-configuration\") pod \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " Apr 20 19:28:39.345731 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.345603 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-installation-pull-secrets\") pod \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\" (UID: \"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a\") " Apr 20 19:28:39.346060 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.345956 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:28:39.346309 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.346141 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:28:39.347896 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.347870 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:28:39.347996 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.347969 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:28:39.348300 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.348277 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:28:39.348446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.348416 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:28:39.348526 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.348437 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-kube-api-access-nx94k" (OuterVolumeSpecName: "kube-api-access-nx94k") pod "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a"). InnerVolumeSpecName "kube-api-access-nx94k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:28:39.354241 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.354200 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" (UID: "aa5f1199-c7b4-4aef-8ca3-fefa785ef54a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:28:39.440840 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.440799 2575 generic.go:358] "Generic (PLEG): container finished" podID="aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" containerID="f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83" exitCode=0
Apr 20 19:28:39.441299 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.440873 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-96bb49785-fxs9k"
Apr 20 19:28:39.441299 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.440883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96bb49785-fxs9k" event={"ID":"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a","Type":"ContainerDied","Data":"f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83"}
Apr 20 19:28:39.441299 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.440920 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-96bb49785-fxs9k" event={"ID":"aa5f1199-c7b4-4aef-8ca3-fefa785ef54a","Type":"ContainerDied","Data":"7612209c306c28f3b070069900f5a47ac506dc8770218a3ea9a3dd8359336806"}
Apr 20 19:28:39.441299 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.440936 2575 scope.go:117] "RemoveContainer" containerID="f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83"
Apr 20 19:28:39.442483 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.442454 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dc875cc46-p2shg" event={"ID":"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c","Type":"ContainerStarted","Data":"ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0"}
Apr 20 19:28:39.442583 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.442493 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dc875cc46-p2shg" event={"ID":"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c","Type":"ContainerStarted","Data":"76302d153f2670e4590201280f55b2411b41e6255ddb07f38b722b1e9e411b67"}
Apr 20 19:28:39.446434 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.446409 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-tls\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:28:39.446434 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.446434 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-ca-trust-extracted\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:28:39.446593 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.446448 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-registry-certificates\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:28:39.446593 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.446459 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nx94k\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-kube-api-access-nx94k\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:28:39.446593 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.446469 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-bound-sa-token\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:28:39.446593 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.446477 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-trusted-ca\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:28:39.446593 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.446485 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-image-registry-private-configuration\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:28:39.446593 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.446495 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a-installation-pull-secrets\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:28:39.450747 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.450724 2575 scope.go:117] "RemoveContainer" containerID="f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83"
Apr 20 19:28:39.451077 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:28:39.451056 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83\": container with ID starting with f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83 not found: ID does not exist" containerID="f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83"
Apr 20 19:28:39.451144 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.451089 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83"} err="failed to get container status \"f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83\": rpc error: code = NotFound desc = could not find container \"f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83\": container with ID starting with f17e3b4772c8db353484aca5915fbafe8c3fe7e171e46f4e23771faeb52bdc83 not found: ID does not exist"
Apr 20 19:28:39.462034 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.461984 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dc875cc46-p2shg" podStartSLOduration=1.461966056 podStartE2EDuration="1.461966056s" podCreationTimestamp="2026-04-20 19:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:28:39.460927989 +0000 UTC m=+175.151341263" watchObservedRunningTime="2026-04-20 19:28:39.461966056 +0000 UTC m=+175.152379331"
Apr 20 19:28:39.474691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.474642 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-96bb49785-fxs9k"]
Apr 20 19:28:39.477859 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:39.477824 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-96bb49785-fxs9k"]
Apr 20 19:28:40.409938 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:40.409902 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-d6zt6"
Apr 20 19:28:40.877054 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:40.876952 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78d788cf7-z9lk4"
Apr 20 19:28:40.877054 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:40.877008 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78d788cf7-z9lk4"
Apr 20 19:28:40.882258 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:40.882229 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78d788cf7-z9lk4"
Apr 20 19:28:40.908055 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:40.908016 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" path="/var/lib/kubelet/pods/aa5f1199-c7b4-4aef-8ca3-fefa785ef54a/volumes"
Apr 20 19:28:41.453691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:41.453663 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78d788cf7-z9lk4"
Apr 20 19:28:46.559030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:46.558984 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-9dc95554d-27nnz"
Apr 20 19:28:46.559030 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:46.559037 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-9dc95554d-27nnz"
Apr 20 19:28:47.467432 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:47.467396 2575 generic.go:358] "Generic (PLEG): container finished" podID="eb4b3625-678c-45ba-87f5-25257a97723a" containerID="e3c778fe95538756878565b8b32ba7f602b06e81c11bbd8d4196b8aaf42f2753" exitCode=0
Apr 20 19:28:47.467672 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:47.467464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-j9gnb" event={"ID":"eb4b3625-678c-45ba-87f5-25257a97723a","Type":"ContainerDied","Data":"e3c778fe95538756878565b8b32ba7f602b06e81c11bbd8d4196b8aaf42f2753"}
Apr 20 19:28:47.467917 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:47.467898 2575 scope.go:117] "RemoveContainer" containerID="e3c778fe95538756878565b8b32ba7f602b06e81c11bbd8d4196b8aaf42f2753"
Apr 20 19:28:48.088676 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:48.088649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65d6b6f7bd-fw8wn_28176d29-9406-4440-8156-fe54a5e5596e/router/0.log"
Apr 20 19:28:48.094248 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:48.094214 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hsc5z_e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425/serve-healthcheck-canary/0.log"
Apr 20 19:28:48.472887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:48.472855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-j9gnb" event={"ID":"eb4b3625-678c-45ba-87f5-25257a97723a","Type":"ContainerStarted","Data":"d8bee7621c6784e232fc2572e6cf4fce0bf1e2dbfdf8dd838ca19f70e78f2792"}
Apr 20 19:28:48.591601 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:48.591552 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5dc875cc46-p2shg"
Apr 20 19:28:48.591819 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:48.591647 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dc875cc46-p2shg"
Apr 20 19:28:48.596377 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:48.596350 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dc875cc46-p2shg"
Apr 20 19:28:49.480156 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:49.480124 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dc875cc46-p2shg"
Apr 20 19:28:49.528790 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:28:49.528748 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78d788cf7-z9lk4"]
Apr 20 19:29:06.564204 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:06.564170 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-9dc95554d-27nnz"
Apr 20 19:29:06.568737 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:06.568705 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-9dc95554d-27nnz"
Apr 20 19:29:14.554323 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.554267 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78d788cf7-z9lk4" podUID="09706621-fcf9-4aa5-bd71-c676016b81d3" containerName="console" containerID="cri-o://5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8" gracePeriod=15
Apr 20 19:29:14.807093 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.807024 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78d788cf7-z9lk4_09706621-fcf9-4aa5-bd71-c676016b81d3/console/0.log"
Apr 20 19:29:14.807093 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.807090 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78d788cf7-z9lk4"
Apr 20 19:29:14.868377 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868349 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-oauth-serving-cert\") pod \"09706621-fcf9-4aa5-bd71-c676016b81d3\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") "
Apr 20 19:29:14.868578 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868429 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-service-ca\") pod \"09706621-fcf9-4aa5-bd71-c676016b81d3\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") "
Apr 20 19:29:14.868578 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868536 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-oauth-config\") pod \"09706621-fcf9-4aa5-bd71-c676016b81d3\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") "
Apr 20 19:29:14.868727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868586 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-console-config\") pod \"09706621-fcf9-4aa5-bd71-c676016b81d3\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") "
Apr 20 19:29:14.868727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868638 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlpbp\" (UniqueName: \"kubernetes.io/projected/09706621-fcf9-4aa5-bd71-c676016b81d3-kube-api-access-nlpbp\") pod \"09706621-fcf9-4aa5-bd71-c676016b81d3\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") "
Apr 20 19:29:14.868837 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868819 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-serving-cert\") pod \"09706621-fcf9-4aa5-bd71-c676016b81d3\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") "
Apr 20 19:29:14.868913 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868888 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "09706621-fcf9-4aa5-bd71-c676016b81d3" (UID: "09706621-fcf9-4aa5-bd71-c676016b81d3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:29:14.868913 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868898 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-service-ca" (OuterVolumeSpecName: "service-ca") pod "09706621-fcf9-4aa5-bd71-c676016b81d3" (UID: "09706621-fcf9-4aa5-bd71-c676016b81d3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:29:14.869011 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868918 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-trusted-ca-bundle\") pod \"09706621-fcf9-4aa5-bd71-c676016b81d3\" (UID: \"09706621-fcf9-4aa5-bd71-c676016b81d3\") "
Apr 20 19:29:14.869011 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.868959 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-console-config" (OuterVolumeSpecName: "console-config") pod "09706621-fcf9-4aa5-bd71-c676016b81d3" (UID: "09706621-fcf9-4aa5-bd71-c676016b81d3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:29:14.869190 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.869173 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-oauth-serving-cert\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:29:14.869254 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.869200 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-service-ca\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:29:14.869254 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.869214 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-console-config\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:29:14.869254 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.869237 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09706621-fcf9-4aa5-bd71-c676016b81d3" (UID: "09706621-fcf9-4aa5-bd71-c676016b81d3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:29:14.871322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.871292 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "09706621-fcf9-4aa5-bd71-c676016b81d3" (UID: "09706621-fcf9-4aa5-bd71-c676016b81d3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:29:14.871475 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.871351 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "09706621-fcf9-4aa5-bd71-c676016b81d3" (UID: "09706621-fcf9-4aa5-bd71-c676016b81d3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:29:14.871475 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.871369 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09706621-fcf9-4aa5-bd71-c676016b81d3-kube-api-access-nlpbp" (OuterVolumeSpecName: "kube-api-access-nlpbp") pod "09706621-fcf9-4aa5-bd71-c676016b81d3" (UID: "09706621-fcf9-4aa5-bd71-c676016b81d3"). InnerVolumeSpecName "kube-api-access-nlpbp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:29:14.970043 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.969997 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09706621-fcf9-4aa5-bd71-c676016b81d3-trusted-ca-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:29:14.970043 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.970043 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-oauth-config\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:29:14.970240 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.970060 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlpbp\" (UniqueName: \"kubernetes.io/projected/09706621-fcf9-4aa5-bd71-c676016b81d3-kube-api-access-nlpbp\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:29:14.970240 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:14.970077 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/09706621-fcf9-4aa5-bd71-c676016b81d3-console-serving-cert\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:29:15.562597 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.562569 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78d788cf7-z9lk4_09706621-fcf9-4aa5-bd71-c676016b81d3/console/0.log"
Apr 20 19:29:15.563037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.562638 2575 generic.go:358] "Generic (PLEG): container finished" podID="09706621-fcf9-4aa5-bd71-c676016b81d3" containerID="5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8" exitCode=2
Apr 20 19:29:15.563037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.562695 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d788cf7-z9lk4" event={"ID":"09706621-fcf9-4aa5-bd71-c676016b81d3","Type":"ContainerDied","Data":"5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8"}
Apr 20 19:29:15.563037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.562721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d788cf7-z9lk4" event={"ID":"09706621-fcf9-4aa5-bd71-c676016b81d3","Type":"ContainerDied","Data":"76d9c13f127afeb45199c312c5ab2be205efaced2acb6bf7803d3e51a24215af"}
Apr 20 19:29:15.563037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.562736 2575 scope.go:117] "RemoveContainer" containerID="5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8"
Apr 20 19:29:15.563037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.562738 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78d788cf7-z9lk4"
Apr 20 19:29:15.571311 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.571288 2575 scope.go:117] "RemoveContainer" containerID="5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8"
Apr 20 19:29:15.571649 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:29:15.571590 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8\": container with ID starting with 5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8 not found: ID does not exist" containerID="5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8"
Apr 20 19:29:15.571702 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.571652 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8"} err="failed to get container status \"5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8\": rpc error: code = NotFound desc = could not find container \"5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8\": container with ID starting with 5808474f3542cc98ef7e4611f8a284ac1869ea078907240ba275a7b495cdbcb8 not found: ID does not exist"
Apr 20 19:29:15.579502 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.579469 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78d788cf7-z9lk4"]
Apr 20 19:29:15.582889 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:15.582866 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78d788cf7-z9lk4"]
Apr 20 19:29:16.907313 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:16.907280 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09706621-fcf9-4aa5-bd71-c676016b81d3" path="/var/lib/kubelet/pods/09706621-fcf9-4aa5-bd71-c676016b81d3/volumes"
Apr 20 19:29:40.217162 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.217124 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6755758b4b-hzksz"]
Apr 20 19:29:40.217691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.217429 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" containerName="registry"
Apr 20 19:29:40.217691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.217440 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" containerName="registry"
Apr 20 19:29:40.217691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.217457 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09706621-fcf9-4aa5-bd71-c676016b81d3" containerName="console"
Apr 20 19:29:40.217691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.217463 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="09706621-fcf9-4aa5-bd71-c676016b81d3" containerName="console"
Apr 20 19:29:40.217691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.217505 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="09706621-fcf9-4aa5-bd71-c676016b81d3" containerName="console"
Apr 20 19:29:40.217691 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.217513 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa5f1199-c7b4-4aef-8ca3-fefa785ef54a" containerName="registry"
Apr 20 19:29:40.221791 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.221765 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.230216 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.230188 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6755758b4b-hzksz"]
Apr 20 19:29:40.284248 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.284202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-oauth-config\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.284248 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.284248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-service-ca\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.284464 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.284270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-serving-cert\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.284464 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.284359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-config\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.284464 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.284458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-oauth-serving-cert\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.284563 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.284478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54r88\" (UniqueName: \"kubernetes.io/projected/a0408d02-505e-4f7d-a05c-1026c8a3090d-kube-api-access-54r88\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.284563 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.284506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-trusted-ca-bundle\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.385198 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.385155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-oauth-serving-cert\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.385198 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.385192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54r88\" (UniqueName: \"kubernetes.io/projected/a0408d02-505e-4f7d-a05c-1026c8a3090d-kube-api-access-54r88\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.385469 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.385218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-trusted-ca-bundle\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.385469 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.385242 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-oauth-config\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.385469 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.385275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-service-ca\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.385469 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.385307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-serving-cert\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.385469 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.385356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-config\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.386133 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.386106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-trusted-ca-bundle\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.386133 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.386117 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-service-ca\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.386292 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.386117 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-oauth-serving-cert\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.386370 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.386350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-config\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.388007 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.387972 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-oauth-config\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.388118 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.388070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-serving-cert\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.393245 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.393216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54r88\" (UniqueName: \"kubernetes.io/projected/a0408d02-505e-4f7d-a05c-1026c8a3090d-kube-api-access-54r88\") pod \"console-6755758b4b-hzksz\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.533846 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.533753 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:40.664192 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:40.664165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6755758b4b-hzksz"]
Apr 20 19:29:40.666879 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:29:40.666845 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0408d02_505e_4f7d_a05c_1026c8a3090d.slice/crio-0ac885035d3ae43bddf43c658bd222dd57eaabd68fabb0f720e4281d7a7df130 WatchSource:0}: Error finding container 0ac885035d3ae43bddf43c658bd222dd57eaabd68fabb0f720e4281d7a7df130: Status 404 returned error can't find the container with id 0ac885035d3ae43bddf43c658bd222dd57eaabd68fabb0f720e4281d7a7df130
Apr 20 19:29:41.638424 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:41.638381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6755758b4b-hzksz" event={"ID":"a0408d02-505e-4f7d-a05c-1026c8a3090d","Type":"ContainerStarted","Data":"527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352"}
Apr 20 19:29:41.638424 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:41.638419 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6755758b4b-hzksz" event={"ID":"a0408d02-505e-4f7d-a05c-1026c8a3090d","Type":"ContainerStarted","Data":"0ac885035d3ae43bddf43c658bd222dd57eaabd68fabb0f720e4281d7a7df130"}
Apr 20 19:29:41.656360 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:41.656299 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6755758b4b-hzksz" podStartSLOduration=1.656282392 podStartE2EDuration="1.656282392s" podCreationTimestamp="2026-04-20 19:29:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:29:41.654881537 +0000 UTC m=+237.345294810" watchObservedRunningTime="2026-04-20 19:29:41.656282392 +0000 UTC m=+237.346695665"
Apr 20 19:29:42.091926 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.091831 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 19:29:42.092339 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.092289 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="alertmanager" containerID="cri-o://b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3" gracePeriod=120
Apr 20 19:29:42.092485 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.092331 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy-metric" containerID="cri-o://71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78" gracePeriod=120
Apr 20 19:29:42.092485 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.092379 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="config-reloader" containerID="cri-o://a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108" gracePeriod=120
Apr 20 19:29:42.092485 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.092363 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy-web" containerID="cri-o://268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3" gracePeriod=120
Apr 20 19:29:42.092485 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.092385 2575 kuberuntime_container.go:864] "Killing container with a grace period"
pod="openshift-monitoring/alertmanager-main-0" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy" containerID="cri-o://f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba" gracePeriod=120 Apr 20 19:29:42.092698 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.092362 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="prom-label-proxy" containerID="cri-o://beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc" gracePeriod=120 Apr 20 19:29:42.646747 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.646708 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d3a8135-3b09-420c-92dc-fbe987865282" containerID="beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc" exitCode=0 Apr 20 19:29:42.646747 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.646737 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d3a8135-3b09-420c-92dc-fbe987865282" containerID="f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba" exitCode=0 Apr 20 19:29:42.646747 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.646745 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d3a8135-3b09-420c-92dc-fbe987865282" containerID="a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108" exitCode=0 Apr 20 19:29:42.646747 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.646751 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d3a8135-3b09-420c-92dc-fbe987865282" containerID="b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3" exitCode=0 Apr 20 19:29:42.647219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.646781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerDied","Data":"beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc"} Apr 20 19:29:42.647219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.646814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerDied","Data":"f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba"} Apr 20 19:29:42.647219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.646824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerDied","Data":"a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108"} Apr 20 19:29:42.647219 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:42.646836 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerDied","Data":"b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3"} Apr 20 19:29:43.343165 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.343140 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.411322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411218 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-tls-assets\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.411322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411278 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-cluster-tls-config\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.411322 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411315 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-trusted-ca-bundle\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.411608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411340 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbvsv\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-kube-api-access-lbvsv\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.411608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411364 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-web-config\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.411608 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411394 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-metrics-client-ca\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.411608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411439 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-web\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.411608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411475 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-main-tls\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.411608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411502 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-main-db\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.411608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411544 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-metric\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 
19:29:43.411608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411582 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.412174 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411685 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-config-volume\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.412174 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411718 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-config-out\") pod \"5d3a8135-3b09-420c-92dc-fbe987865282\" (UID: \"5d3a8135-3b09-420c-92dc-fbe987865282\") " Apr 20 19:29:43.412174 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.411821 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:43.412174 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.412025 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.413330 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.413296 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:29:43.414919 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.414885 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:29:43.416282 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.415384 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-config-out" (OuterVolumeSpecName: "config-out") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:29:43.416282 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.415491 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-config-volume" (OuterVolumeSpecName: "config-volume") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:43.416282 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.415508 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-kube-api-access-lbvsv" (OuterVolumeSpecName: "kube-api-access-lbvsv") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "kube-api-access-lbvsv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:29:43.416696 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.416666 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:43.417479 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.417446 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:29:43.417578 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.417497 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:43.418106 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.418075 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:43.418208 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.418114 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:43.420863 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.420827 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). 
InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:43.427804 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.427768 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-web-config" (OuterVolumeSpecName: "web-config") pod "5d3a8135-3b09-420c-92dc-fbe987865282" (UID: "5d3a8135-3b09-420c-92dc-fbe987865282"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:29:43.512559 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512520 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512559 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512554 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-main-tls\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512559 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512566 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-alertmanager-main-db\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512575 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512585 2575 
reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512595 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-config-volume\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512605 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5d3a8135-3b09-420c-92dc-fbe987865282-config-out\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512644 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-tls-assets\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512652 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-cluster-tls-config\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512660 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbvsv\" (UniqueName: \"kubernetes.io/projected/5d3a8135-3b09-420c-92dc-fbe987865282-kube-api-access-lbvsv\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512669 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/5d3a8135-3b09-420c-92dc-fbe987865282-web-config\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.512827 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.512679 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3a8135-3b09-420c-92dc-fbe987865282-metrics-client-ca\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.652638 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.652570 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d3a8135-3b09-420c-92dc-fbe987865282" containerID="71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78" exitCode=0 Apr 20 19:29:43.652638 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.652603 2575 generic.go:358] "Generic (PLEG): container finished" podID="5d3a8135-3b09-420c-92dc-fbe987865282" containerID="268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3" exitCode=0 Apr 20 19:29:43.653069 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.652660 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerDied","Data":"71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78"} Apr 20 19:29:43.653069 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.652699 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerDied","Data":"268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3"} Apr 20 19:29:43.653069 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.652707 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.653069 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.652719 2575 scope.go:117] "RemoveContainer" containerID="beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc" Apr 20 19:29:43.653069 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.652710 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5d3a8135-3b09-420c-92dc-fbe987865282","Type":"ContainerDied","Data":"2bd3c408e7c7a1415b6fe007ba54efcfa9aa51cbf3fa952bc407f85bb541a17f"} Apr 20 19:29:43.660936 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.660913 2575 scope.go:117] "RemoveContainer" containerID="71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78" Apr 20 19:29:43.669248 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.669225 2575 scope.go:117] "RemoveContainer" containerID="f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba" Apr 20 19:29:43.675446 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.675415 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:29:43.677145 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.677112 2575 scope.go:117] "RemoveContainer" containerID="268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3" Apr 20 19:29:43.680206 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.680180 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:29:43.684800 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.684780 2575 scope.go:117] "RemoveContainer" containerID="a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108" Apr 20 19:29:43.692519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.692498 2575 scope.go:117] "RemoveContainer" containerID="b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3" Apr 20 19:29:43.700158 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:29:43.700138 2575 scope.go:117] "RemoveContainer" containerID="a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04" Apr 20 19:29:43.703071 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703044 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:29:43.703340 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703327 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="prom-label-proxy" Apr 20 19:29:43.703383 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703343 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="prom-label-proxy" Apr 20 19:29:43.703383 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703353 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="config-reloader" Apr 20 19:29:43.703383 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703358 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="config-reloader" Apr 20 19:29:43.703383 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703366 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy-web" Apr 20 19:29:43.703383 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703372 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy-web" Apr 20 19:29:43.703383 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703380 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:29:43.703386 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703404 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="init-config-reloader" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703412 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="init-config-reloader" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703418 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="alertmanager" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703424 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="alertmanager" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703435 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy-metric" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703440 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy-metric" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703487 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="alertmanager" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703496 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="prom-label-proxy" Apr 20 19:29:43.703552 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:29:43.703503 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy-web" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703509 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703514 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="kube-rbac-proxy-metric" Apr 20 19:29:43.703552 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.703520 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" containerName="config-reloader" Apr 20 19:29:43.708386 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.708359 2575 scope.go:117] "RemoveContainer" containerID="beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc" Apr 20 19:29:43.708718 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:29:43.708697 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc\": container with ID starting with beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc not found: ID does not exist" containerID="beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc" Apr 20 19:29:43.708831 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.708733 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc"} err="failed to get container status \"beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc\": rpc error: code = NotFound desc = could not find container 
\"beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc\": container with ID starting with beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc not found: ID does not exist" Apr 20 19:29:43.708831 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.708755 2575 scope.go:117] "RemoveContainer" containerID="71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78" Apr 20 19:29:43.709003 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.708985 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.709071 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:29:43.709055 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78\": container with ID starting with 71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78 not found: ID does not exist" containerID="71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78" Apr 20 19:29:43.709111 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.709079 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78"} err="failed to get container status \"71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78\": rpc error: code = NotFound desc = could not find container \"71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78\": container with ID starting with 71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78 not found: ID does not exist" Apr 20 19:29:43.709111 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.709095 2575 scope.go:117] "RemoveContainer" containerID="f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba" Apr 20 19:29:43.709532 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:29:43.709511 2575 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba\": container with ID starting with f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba not found: ID does not exist" containerID="f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba" Apr 20 19:29:43.709666 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.709537 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba"} err="failed to get container status \"f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba\": rpc error: code = NotFound desc = could not find container \"f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba\": container with ID starting with f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba not found: ID does not exist" Apr 20 19:29:43.709666 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.709553 2575 scope.go:117] "RemoveContainer" containerID="268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3" Apr 20 19:29:43.710087 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:29:43.710066 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3\": container with ID starting with 268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3 not found: ID does not exist" containerID="268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3" Apr 20 19:29:43.710188 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.710090 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3"} err="failed to get container status 
\"268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3\": rpc error: code = NotFound desc = could not find container \"268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3\": container with ID starting with 268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3 not found: ID does not exist" Apr 20 19:29:43.710188 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.710109 2575 scope.go:117] "RemoveContainer" containerID="a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108" Apr 20 19:29:43.710381 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:29:43.710361 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108\": container with ID starting with a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108 not found: ID does not exist" containerID="a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108" Apr 20 19:29:43.710427 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.710390 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108"} err="failed to get container status \"a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108\": rpc error: code = NotFound desc = could not find container \"a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108\": container with ID starting with a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108 not found: ID does not exist" Apr 20 19:29:43.710427 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.710415 2575 scope.go:117] "RemoveContainer" containerID="b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3" Apr 20 19:29:43.710732 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:29:43.710714 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3\": container with ID starting with b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3 not found: ID does not exist" containerID="b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3" Apr 20 19:29:43.710841 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.710735 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3"} err="failed to get container status \"b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3\": rpc error: code = NotFound desc = could not find container \"b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3\": container with ID starting with b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3 not found: ID does not exist" Apr 20 19:29:43.710841 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.710748 2575 scope.go:117] "RemoveContainer" containerID="a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04" Apr 20 19:29:43.711031 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:29:43.711014 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04\": container with ID starting with a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04 not found: ID does not exist" containerID="a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04" Apr 20 19:29:43.711075 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711040 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04"} err="failed to get container status \"a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04\": rpc error: code = NotFound desc = could not find 
container \"a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04\": container with ID starting with a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04 not found: ID does not exist" Apr 20 19:29:43.711075 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711053 2575 scope.go:117] "RemoveContainer" containerID="beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc" Apr 20 19:29:43.711352 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711324 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc"} err="failed to get container status \"beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc\": rpc error: code = NotFound desc = could not find container \"beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc\": container with ID starting with beb0f1f88986e25cd410c1908fb6d81bc530c65729e48f8d3814791a405e9adc not found: ID does not exist" Apr 20 19:29:43.711411 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711355 2575 scope.go:117] "RemoveContainer" containerID="71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78" Apr 20 19:29:43.711642 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711589 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 19:29:43.711709 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711686 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 19:29:43.711751 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711659 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78"} err="failed to get container status 
\"71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78\": rpc error: code = NotFound desc = could not find container \"71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78\": container with ID starting with 71fd1bcf6865680c3762bd730fd78211f4c1f2f6d73482d2095ea2c3fccaae78 not found: ID does not exist" Apr 20 19:29:43.711751 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711625 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 19:29:43.711751 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711739 2575 scope.go:117] "RemoveContainer" containerID="f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba" Apr 20 19:29:43.711874 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.711682 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-q9zsw\"" Apr 20 19:29:43.712023 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712008 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 19:29:43.712080 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712007 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba"} err="failed to get container status \"f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba\": rpc error: code = NotFound desc = could not find container \"f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba\": container with ID starting with f762a7f7f44b9fe9b640f6562d30643c6e7277f28f5744711a8ed10f4408b2ba not found: ID does not exist" Apr 20 19:29:43.712080 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712045 2575 scope.go:117] "RemoveContainer" 
containerID="268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3" Apr 20 19:29:43.712080 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712076 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 19:29:43.712196 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712147 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 19:29:43.712246 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712231 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 19:29:43.712489 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712300 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 19:29:43.712489 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712426 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3"} err="failed to get container status \"268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3\": rpc error: code = NotFound desc = could not find container \"268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3\": container with ID starting with 268a070c6ee5e6751529cc58170c40f9b021e70679ee258e0cb6fbb4ffaeadf3 not found: ID does not exist" Apr 20 19:29:43.712489 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712444 2575 scope.go:117] "RemoveContainer" containerID="a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108" Apr 20 19:29:43.712840 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712816 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108"} 
err="failed to get container status \"a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108\": rpc error: code = NotFound desc = could not find container \"a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108\": container with ID starting with a1f18aef9f42efbfffb5d52074c757d882cc5eab944403999352ede129b2b108 not found: ID does not exist" Apr 20 19:29:43.712935 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.712842 2575 scope.go:117] "RemoveContainer" containerID="b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3" Apr 20 19:29:43.713190 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.713171 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3"} err="failed to get container status \"b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3\": rpc error: code = NotFound desc = could not find container \"b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3\": container with ID starting with b82ee518f8162ebef9c2f91036507c96039c6f42f28c67351f3faa9f3be89ae3 not found: ID does not exist" Apr 20 19:29:43.713190 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.713190 2575 scope.go:117] "RemoveContainer" containerID="a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04" Apr 20 19:29:43.713512 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.713477 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04"} err="failed to get container status \"a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04\": rpc error: code = NotFound desc = could not find container \"a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04\": container with ID starting with a178eddef10cb7192dbc9a61dbb9f8a0db55e414b73e303c33ada0237e2cda04 not found: ID does not exist" Apr 20 
19:29:43.723665 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.723587 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 19:29:43.723870 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.723849 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:29:43.814681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.814607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-config-volume\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.814681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.814687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-web-config\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.814905 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.814708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d871b7e8-cab0-4215-80e3-01acd23fbf7a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.814905 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.814750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d871b7e8-cab0-4215-80e3-01acd23fbf7a-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.814905 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.814821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d871b7e8-cab0-4215-80e3-01acd23fbf7a-config-out\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.814905 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.814854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d871b7e8-cab0-4215-80e3-01acd23fbf7a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.815052 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.814903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrzpt\" (UniqueName: \"kubernetes.io/projected/d871b7e8-cab0-4215-80e3-01acd23fbf7a-kube-api-access-xrzpt\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.815052 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.814947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.815052 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.814985 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.815052 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.815013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.815052 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.815037 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.815214 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.815070 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.815214 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.815086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d871b7e8-cab0-4215-80e3-01acd23fbf7a-alertmanager-main-db\") pod 
\"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.916294 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrzpt\" (UniqueName: \"kubernetes.io/projected/d871b7e8-cab0-4215-80e3-01acd23fbf7a-kube-api-access-xrzpt\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.916294 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.916294 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.916580 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.916580 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916347 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.916580 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.916835 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d871b7e8-cab0-4215-80e3-01acd23fbf7a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.917016 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-config-volume\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.917016 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-web-config\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.917016 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.916927 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d871b7e8-cab0-4215-80e3-01acd23fbf7a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.917016 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.917005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d871b7e8-cab0-4215-80e3-01acd23fbf7a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.917418 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.917047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d871b7e8-cab0-4215-80e3-01acd23fbf7a-config-out\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.917418 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.917075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d871b7e8-cab0-4215-80e3-01acd23fbf7a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.917418 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.917219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d871b7e8-cab0-4215-80e3-01acd23fbf7a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.918404 ip-10-0-135-55 kubenswrapper[2575]: 
I0420 19:29:43.918116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d871b7e8-cab0-4215-80e3-01acd23fbf7a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.918756 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.918728 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d871b7e8-cab0-4215-80e3-01acd23fbf7a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.919464 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.919433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.919692 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.919672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:29:43.919692 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.919684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " 
pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:29:43.920279 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.920226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:29:43.920422 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.920404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:29:43.920754 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.920735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d871b7e8-cab0-4215-80e3-01acd23fbf7a-config-out\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:29:43.921037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.921012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-web-config\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:29:43.921375 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.921356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d871b7e8-cab0-4215-80e3-01acd23fbf7a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:29:43.921553 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.921538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d871b7e8-cab0-4215-80e3-01acd23fbf7a-config-volume\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:29:43.925175 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:43.925149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrzpt\" (UniqueName: \"kubernetes.io/projected/d871b7e8-cab0-4215-80e3-01acd23fbf7a-kube-api-access-xrzpt\") pod \"alertmanager-main-0\" (UID: \"d871b7e8-cab0-4215-80e3-01acd23fbf7a\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:29:44.024509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:44.024467 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 19:29:44.158179 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:44.158086 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 19:29:44.174152 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:29:44.174121 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd871b7e8_cab0_4215_80e3_01acd23fbf7a.slice/crio-addd16fa9f31e812e3a3d565a619794f30064ce7414361c220fdcea646d5f3a0 WatchSource:0}: Error finding container addd16fa9f31e812e3a3d565a619794f30064ce7414361c220fdcea646d5f3a0: Status 404 returned error can't find the container with id addd16fa9f31e812e3a3d565a619794f30064ce7414361c220fdcea646d5f3a0
Apr 20 19:29:44.656657 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:44.656603 2575 generic.go:358] "Generic (PLEG): container finished" podID="d871b7e8-cab0-4215-80e3-01acd23fbf7a" containerID="af37a81678b71aaab116beca36afcf036ca0c865b8f5fbb860492d24c71ff40f" exitCode=0
Apr 20 19:29:44.657096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:44.656722 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d871b7e8-cab0-4215-80e3-01acd23fbf7a","Type":"ContainerDied","Data":"af37a81678b71aaab116beca36afcf036ca0c865b8f5fbb860492d24c71ff40f"}
Apr 20 19:29:44.657096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:44.656765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d871b7e8-cab0-4215-80e3-01acd23fbf7a","Type":"ContainerStarted","Data":"addd16fa9f31e812e3a3d565a619794f30064ce7414361c220fdcea646d5f3a0"}
Apr 20 19:29:44.908864 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:44.908833 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3a8135-3b09-420c-92dc-fbe987865282" path="/var/lib/kubelet/pods/5d3a8135-3b09-420c-92dc-fbe987865282/volumes"
Apr 20 19:29:45.663528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:45.663483 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d871b7e8-cab0-4215-80e3-01acd23fbf7a","Type":"ContainerStarted","Data":"8673948d511cdf2450c0f1c8db2e6722b551c7aa62b34f2f141fe4a1602b8bbd"}
Apr 20 19:29:45.663528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:45.663520 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d871b7e8-cab0-4215-80e3-01acd23fbf7a","Type":"ContainerStarted","Data":"dcd331df953786e06c491c8c1471214fb096ce64b58270f03f6896ba8057b13e"}
Apr 20 19:29:45.663528 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:45.663534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d871b7e8-cab0-4215-80e3-01acd23fbf7a","Type":"ContainerStarted","Data":"298744cb23638c85d605078d1013a41588e455d93f58e3734dba22258c2e64e1"}
Apr 20 19:29:45.664097 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:45.663545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d871b7e8-cab0-4215-80e3-01acd23fbf7a","Type":"ContainerStarted","Data":"34ac82c9b0531f8865c31f443344d8b82e37a2aeee6a40fc8cc9acac5e965493"}
Apr 20 19:29:45.664097 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:45.663554 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d871b7e8-cab0-4215-80e3-01acd23fbf7a","Type":"ContainerStarted","Data":"c87df360674210f71d4495d6bd1ded4d3af23d9a208f1fa1528245e796d23133"}
Apr 20 19:29:45.664097 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:45.663562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d871b7e8-cab0-4215-80e3-01acd23fbf7a","Type":"ContainerStarted","Data":"06d3ab7b3bea354b32c50fe3da9b04b1047daf2a903c7dd562d7ad3f1be565e4"}
Apr 20 19:29:45.687965 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:45.687858 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.687839864 podStartE2EDuration="2.687839864s" podCreationTimestamp="2026-04-20 19:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:29:45.68639323 +0000 UTC m=+241.376806542" watchObservedRunningTime="2026-04-20 19:29:45.687839864 +0000 UTC m=+241.378253202"
Apr 20 19:29:50.534563 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:50.534523 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:50.534989 ip-10-0-135-55
kubenswrapper[2575]: I0420 19:29:50.534658 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:50.539423 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:50.539395 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:50.683585 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:50.683554 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6755758b4b-hzksz"
Apr 20 19:29:50.726608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:50.726570 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dc875cc46-p2shg"]
Apr 20 19:29:56.834843 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:56.834746 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:29:56.837310 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:56.837280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/234402da-caaa-48f3-8a69-400f12f55eb6-metrics-certs\") pod \"network-metrics-daemon-j9xp6\" (UID: \"234402da-caaa-48f3-8a69-400f12f55eb6\") " pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:29:57.007098 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:57.007067 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8fpn\""
Apr 20 19:29:57.015153 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:57.015122 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j9xp6"
Apr 20 19:29:57.143265 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:57.143231 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j9xp6"]
Apr 20 19:29:57.146740 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:29:57.146705 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234402da_caaa_48f3_8a69_400f12f55eb6.slice/crio-b91970a7e294d874dd8d0aff828230fac04a94c395ca245d2ad0dd079bfb42c7 WatchSource:0}: Error finding container b91970a7e294d874dd8d0aff828230fac04a94c395ca245d2ad0dd079bfb42c7: Status 404 returned error can't find the container with id b91970a7e294d874dd8d0aff828230fac04a94c395ca245d2ad0dd079bfb42c7
Apr 20 19:29:57.701887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:57.701846 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j9xp6" event={"ID":"234402da-caaa-48f3-8a69-400f12f55eb6","Type":"ContainerStarted","Data":"b91970a7e294d874dd8d0aff828230fac04a94c395ca245d2ad0dd079bfb42c7"}
Apr 20 19:29:58.706581 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:58.706548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j9xp6" event={"ID":"234402da-caaa-48f3-8a69-400f12f55eb6","Type":"ContainerStarted","Data":"bcc76ba94d6ba601589ed23089a7a41cf52ddf02978fe578af66a7c948a07b63"}
Apr 20 19:29:58.706581 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:58.706581 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j9xp6" event={"ID":"234402da-caaa-48f3-8a69-400f12f55eb6","Type":"ContainerStarted","Data":"c141fe04075bafb556a6e05af8ec339d20fb04fce5a7bf4e9afea4774d0ca323"}
Apr 20 19:29:58.721072 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:29:58.721002 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j9xp6" podStartSLOduration=252.618836386 podStartE2EDuration="4m13.720982131s" podCreationTimestamp="2026-04-20 19:25:45 +0000 UTC" firstStartedPulling="2026-04-20 19:29:57.149336547 +0000 UTC m=+252.839749800" lastFinishedPulling="2026-04-20 19:29:58.251482291 +0000 UTC m=+253.941895545" observedRunningTime="2026-04-20 19:29:58.720779983 +0000 UTC m=+254.411193258" watchObservedRunningTime="2026-04-20 19:29:58.720982131 +0000 UTC m=+254.411395406"
Apr 20 19:30:15.747311 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:15.747246 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5dc875cc46-p2shg" podUID="69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" containerName="console" containerID="cri-o://ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0" gracePeriod=15
Apr 20 19:30:16.005910 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.005843 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dc875cc46-p2shg_69ef48d3-2a44-4de6-9e9e-f2f19c7d831c/console/0.log"
Apr 20 19:30:16.005910 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.005906 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dc875cc46-p2shg"
Apr 20 19:30:16.096787 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.096747 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-trusted-ca-bundle\") pod \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") "
Apr 20 19:30:16.096989 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.096797 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-oauth-serving-cert\") pod \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") "
Apr 20 19:30:16.096989 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.096825 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-config\") pod \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") "
Apr 20 19:30:16.096989 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.096862 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-service-ca\") pod \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") "
Apr 20 19:30:16.096989 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.096902 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-serving-cert\") pod \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") "
Apr 20 19:30:16.096989 ip-10-0-135-55
kubenswrapper[2575]: I0420 19:30:16.096925 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2whg\" (UniqueName: \"kubernetes.io/projected/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-kube-api-access-d2whg\") pod \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") "
Apr 20 19:30:16.096989 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.096941 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-oauth-config\") pod \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\" (UID: \"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c\") "
Apr 20 19:30:16.097328 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.097303 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" (UID: "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:30:16.097388 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.097300 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" (UID: "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:30:16.097451 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.097427 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-service-ca" (OuterVolumeSpecName: "service-ca") pod "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" (UID: "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:30:16.097502 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.097424 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-config" (OuterVolumeSpecName: "console-config") pod "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" (UID: "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:30:16.099317 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.099283 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" (UID: "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:30:16.099413 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.099286 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-kube-api-access-d2whg" (OuterVolumeSpecName: "kube-api-access-d2whg") pod "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" (UID: "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c"). InnerVolumeSpecName "kube-api-access-d2whg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:30:16.099413 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.099356 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" (UID: "69ef48d3-2a44-4de6-9e9e-f2f19c7d831c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:30:16.197913 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.197869 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2whg\" (UniqueName: \"kubernetes.io/projected/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-kube-api-access-d2whg\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:30:16.197913 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.197908 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-oauth-config\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:30:16.197913 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.197918 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-trusted-ca-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:30:16.198162 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.197927 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-oauth-serving-cert\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:30:16.198162 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.197938 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-config\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:30:16.198162 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.197947 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-service-ca\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:30:16.198162 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.197955 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c-console-serving-cert\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:30:16.767772 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.767743 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dc875cc46-p2shg_69ef48d3-2a44-4de6-9e9e-f2f19c7d831c/console/0.log"
Apr 20 19:30:16.768308 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.767788 2575 generic.go:358] "Generic (PLEG): container finished" podID="69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" containerID="ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0" exitCode=2
Apr 20 19:30:16.768308 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.767857 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dc875cc46-p2shg" event={"ID":"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c","Type":"ContainerDied","Data":"ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0"}
Apr 20 19:30:16.768308 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.767892 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dc875cc46-p2shg" event={"ID":"69ef48d3-2a44-4de6-9e9e-f2f19c7d831c","Type":"ContainerDied","Data":"76302d153f2670e4590201280f55b2411b41e6255ddb07f38b722b1e9e411b67"}
Apr 20 19:30:16.768308 ip-10-0-135-55 kubenswrapper[2575]:
I0420 19:30:16.767897 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dc875cc46-p2shg"
Apr 20 19:30:16.768308 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.767911 2575 scope.go:117] "RemoveContainer" containerID="ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0"
Apr 20 19:30:16.776516 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.776498 2575 scope.go:117] "RemoveContainer" containerID="ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0"
Apr 20 19:30:16.776856 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:30:16.776836 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0\": container with ID starting with ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0 not found: ID does not exist" containerID="ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0"
Apr 20 19:30:16.776914 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.776868 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0"} err="failed to get container status \"ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0\": rpc error: code = NotFound desc = could not find container \"ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0\": container with ID starting with ecf6cf388dc22ec8d53b26183fca405adfa0af9014e34e2d77bcf9f5c1b132f0 not found: ID does not exist"
Apr 20 19:30:16.790603 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.790563 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dc875cc46-p2shg"]
Apr 20 19:30:16.793996 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.793962 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5dc875cc46-p2shg"]
Apr 20 19:30:16.907588 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:16.907545 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" path="/var/lib/kubelet/pods/69ef48d3-2a44-4de6-9e9e-f2f19c7d831c/volumes"
Apr 20 19:30:34.434221 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.430928 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-85qqw"]
Apr 20 19:30:34.434221 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.431360 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" containerName="console"
Apr 20 19:30:34.434221 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.431402 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" containerName="console"
Apr 20 19:30:34.434221 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.431491 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="69ef48d3-2a44-4de6-9e9e-f2f19c7d831c" containerName="console"
Apr 20 19:30:34.435143 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.435119 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.435698 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.435665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/18c8a371-8917-4525-b6a4-7df091337b68-dbus\") pod \"global-pull-secret-syncer-85qqw\" (UID: \"18c8a371-8917-4525-b6a4-7df091337b68\") " pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.435856 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.435809 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/18c8a371-8917-4525-b6a4-7df091337b68-original-pull-secret\") pod \"global-pull-secret-syncer-85qqw\" (UID: \"18c8a371-8917-4525-b6a4-7df091337b68\") " pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.435938 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.435858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/18c8a371-8917-4525-b6a4-7df091337b68-kubelet-config\") pod \"global-pull-secret-syncer-85qqw\" (UID: \"18c8a371-8917-4525-b6a4-7df091337b68\") " pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.437887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.437861 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 19:30:34.439088 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.439068 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-85qqw"]
Apr 20 19:30:34.536661 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.536585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/18c8a371-8917-4525-b6a4-7df091337b68-original-pull-secret\") pod \"global-pull-secret-syncer-85qqw\" (UID: \"18c8a371-8917-4525-b6a4-7df091337b68\") " pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.536881 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.536679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/18c8a371-8917-4525-b6a4-7df091337b68-kubelet-config\") pod \"global-pull-secret-syncer-85qqw\" (UID: \"18c8a371-8917-4525-b6a4-7df091337b68\") " pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.536881 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.536725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/18c8a371-8917-4525-b6a4-7df091337b68-dbus\") pod \"global-pull-secret-syncer-85qqw\" (UID: \"18c8a371-8917-4525-b6a4-7df091337b68\") " pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.536881 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.536849 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/18c8a371-8917-4525-b6a4-7df091337b68-kubelet-config\") pod \"global-pull-secret-syncer-85qqw\" (UID: \"18c8a371-8917-4525-b6a4-7df091337b68\") " pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.537020 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.536933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/18c8a371-8917-4525-b6a4-7df091337b68-dbus\") pod \"global-pull-secret-syncer-85qqw\" (UID: \"18c8a371-8917-4525-b6a4-7df091337b68\") " pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.539184 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.539160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/18c8a371-8917-4525-b6a4-7df091337b68-original-pull-secret\") pod \"global-pull-secret-syncer-85qqw\" (UID: \"18c8a371-8917-4525-b6a4-7df091337b68\") " pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.745508 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.745397 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-85qqw"
Apr 20 19:30:34.874215 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:34.874172 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-85qqw"]
Apr 20 19:30:34.878139 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:30:34.878105 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18c8a371_8917_4525_b6a4_7df091337b68.slice/crio-1035e4822de39e364de2ce960b62a54e8c74896776f850b6eafb3d151bd83db5 WatchSource:0}: Error finding container 1035e4822de39e364de2ce960b62a54e8c74896776f850b6eafb3d151bd83db5: Status 404 returned error can't find the container with id 1035e4822de39e364de2ce960b62a54e8c74896776f850b6eafb3d151bd83db5
Apr 20 19:30:35.829337 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:35.829301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-85qqw" event={"ID":"18c8a371-8917-4525-b6a4-7df091337b68","Type":"ContainerStarted","Data":"1035e4822de39e364de2ce960b62a54e8c74896776f850b6eafb3d151bd83db5"}
Apr 20 19:30:39.843982 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:39.843944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-85qqw" event={"ID":"18c8a371-8917-4525-b6a4-7df091337b68","Type":"ContainerStarted","Data":"4978edd8437cc7e670ab4dd58a3c172c0c6a456866a9a1288fac4a1cd26eddbc"}
Apr 20 19:30:39.863931 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:39.863872 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-85qqw" podStartSLOduration=1.932356903 podStartE2EDuration="5.863856471s" podCreationTimestamp="2026-04-20 19:30:34 +0000 UTC" firstStartedPulling="2026-04-20 19:30:34.880061735 +0000 UTC m=+290.570474987" lastFinishedPulling="2026-04-20 19:30:38.8115613 +0000 UTC m=+294.501974555" observedRunningTime="2026-04-20 19:30:39.862750272 +0000 UTC m=+295.553163550" watchObservedRunningTime="2026-04-20 19:30:39.863856471 +0000 UTC m=+295.554269748"
Apr 20 19:30:44.784325 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:44.784295 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w558t_97e58d97-c4b1-4d4a-a6f6-d87a86138255/ovn-acl-logging/0.log"
Apr 20 19:30:44.784325 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:44.784318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w558t_97e58d97-c4b1-4d4a-a6f6-d87a86138255/ovn-acl-logging/0.log"
Apr 20 19:30:44.786999 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:44.786973 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 19:30:49.194471 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.194433 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"]
Apr 20 19:30:49.197726 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.197706 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"
Apr 20 19:30:49.203352 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.203317 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 19:30:49.203687 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.203669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5pq88\""
Apr 20 19:30:49.204901 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.204491 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 19:30:49.209960 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.209926 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"]
Apr 20 19:30:49.250124 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.250079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"
Apr 20 19:30:49.250124 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.250124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"
Apr 20 19:30:49.250345 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.250191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzb46\" (UniqueName: \"kubernetes.io/projected/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-kube-api-access-zzb46\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"
Apr 20 19:30:49.351679 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.351641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"
Apr 20 19:30:49.351821 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.351700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"
Apr 20 19:30:49.351821 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.351724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzb46\" (UniqueName: \"kubernetes.io/projected/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-kube-api-access-zzb46\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"
Apr 20 19:30:49.352026 ip-10-0-135-55
kubenswrapper[2575]: I0420 19:30:49.352003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" Apr 20 19:30:49.352088 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.352067 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" Apr 20 19:30:49.360540 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.360497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzb46\" (UniqueName: \"kubernetes.io/projected/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-kube-api-access-zzb46\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" Apr 20 19:30:49.508171 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.508074 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" Apr 20 19:30:49.649005 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.648948 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x"] Apr 20 19:30:49.651919 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:30:49.651889 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b2bc7d_8044_4eb2_b9f2_a55c53b8075c.slice/crio-4243decdcc05bfadb381f306a5ab58eb62ebf2eac06aca1b424e33ec268145b6 WatchSource:0}: Error finding container 4243decdcc05bfadb381f306a5ab58eb62ebf2eac06aca1b424e33ec268145b6: Status 404 returned error can't find the container with id 4243decdcc05bfadb381f306a5ab58eb62ebf2eac06aca1b424e33ec268145b6 Apr 20 19:30:49.653987 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.653964 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:30:49.874521 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:49.874432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" event={"ID":"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c","Type":"ContainerStarted","Data":"4243decdcc05bfadb381f306a5ab58eb62ebf2eac06aca1b424e33ec268145b6"} Apr 20 19:30:55.894569 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:55.894532 2575 generic.go:358] "Generic (PLEG): container finished" podID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerID="38a33503e72d86fdd61541088cd31bb6f88345569643b56f19987112f6d053a3" exitCode=0 Apr 20 19:30:55.894996 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:55.894582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" 
event={"ID":"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c","Type":"ContainerDied","Data":"38a33503e72d86fdd61541088cd31bb6f88345569643b56f19987112f6d053a3"} Apr 20 19:30:58.905910 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:58.905879 2575 generic.go:358] "Generic (PLEG): container finished" podID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerID="64ad9962f51fc4b1eec89e21b030cfa5149d660483e1d8ed923ad4111b315f15" exitCode=0 Apr 20 19:30:58.907931 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:30:58.907903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" event={"ID":"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c","Type":"ContainerDied","Data":"64ad9962f51fc4b1eec89e21b030cfa5149d660483e1d8ed923ad4111b315f15"} Apr 20 19:31:05.929163 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:05.929125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" event={"ID":"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c","Type":"ContainerStarted","Data":"e205bf9b60c67d52e51af88a8e6292ac3304ca0f3d16cadc7a6896c6c9e63fdb"} Apr 20 19:31:05.947748 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:05.947687 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" podStartSLOduration=0.795727386 podStartE2EDuration="16.947668739s" podCreationTimestamp="2026-04-20 19:30:49 +0000 UTC" firstStartedPulling="2026-04-20 19:30:49.654099787 +0000 UTC m=+305.344513039" lastFinishedPulling="2026-04-20 19:31:05.806040935 +0000 UTC m=+321.496454392" observedRunningTime="2026-04-20 19:31:05.947419185 +0000 UTC m=+321.637832494" watchObservedRunningTime="2026-04-20 19:31:05.947668739 +0000 UTC m=+321.638082015" Apr 20 19:31:06.934329 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:06.934287 2575 generic.go:358] "Generic (PLEG): container 
finished" podID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerID="e205bf9b60c67d52e51af88a8e6292ac3304ca0f3d16cadc7a6896c6c9e63fdb" exitCode=0 Apr 20 19:31:06.934739 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:06.934394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" event={"ID":"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c","Type":"ContainerDied","Data":"e205bf9b60c67d52e51af88a8e6292ac3304ca0f3d16cadc7a6896c6c9e63fdb"} Apr 20 19:31:08.064422 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.064395 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" Apr 20 19:31:08.130188 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.130149 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-util\") pod \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " Apr 20 19:31:08.130385 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.130232 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-bundle\") pod \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " Apr 20 19:31:08.130385 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.130288 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzb46\" (UniqueName: \"kubernetes.io/projected/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-kube-api-access-zzb46\") pod \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\" (UID: \"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c\") " Apr 20 19:31:08.130916 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.130880 2575 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-bundle" (OuterVolumeSpecName: "bundle") pod "79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" (UID: "79b2bc7d-8044-4eb2-b9f2-a55c53b8075c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:31:08.132654 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.132627 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-kube-api-access-zzb46" (OuterVolumeSpecName: "kube-api-access-zzb46") pod "79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" (UID: "79b2bc7d-8044-4eb2-b9f2-a55c53b8075c"). InnerVolumeSpecName "kube-api-access-zzb46". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:31:08.135672 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.135633 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-util" (OuterVolumeSpecName: "util") pod "79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" (UID: "79b2bc7d-8044-4eb2-b9f2-a55c53b8075c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:31:08.231741 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.231638 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:08.231741 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.231695 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zzb46\" (UniqueName: \"kubernetes.io/projected/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-kube-api-access-zzb46\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:08.231741 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.231707 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/79b2bc7d-8044-4eb2-b9f2-a55c53b8075c-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:08.942335 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.942300 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" event={"ID":"79b2bc7d-8044-4eb2-b9f2-a55c53b8075c","Type":"ContainerDied","Data":"4243decdcc05bfadb381f306a5ab58eb62ebf2eac06aca1b424e33ec268145b6"} Apr 20 19:31:08.942335 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.942336 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4243decdcc05bfadb381f306a5ab58eb62ebf2eac06aca1b424e33ec268145b6" Apr 20 19:31:08.942536 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:08.942345 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d8252x" Apr 20 19:31:11.939221 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.939184 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr"] Apr 20 19:31:11.939645 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.939502 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerName="extract" Apr 20 19:31:11.939645 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.939513 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerName="extract" Apr 20 19:31:11.939645 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.939526 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerName="util" Apr 20 19:31:11.939645 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.939532 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerName="util" Apr 20 19:31:11.939645 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.939542 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerName="pull" Apr 20 19:31:11.939645 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.939547 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerName="pull" Apr 20 19:31:11.939645 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.939603 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="79b2bc7d-8044-4eb2-b9f2-a55c53b8075c" containerName="extract" Apr 20 19:31:11.953909 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.953877 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" Apr 20 19:31:11.956301 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.956266 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr"] Apr 20 19:31:11.957070 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.957048 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:31:11.957215 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.957048 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-4gtxw\"" Apr 20 19:31:11.957215 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:11.957051 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 19:31:12.064453 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:12.064399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcdh6\" (UniqueName: \"kubernetes.io/projected/5e37c0e6-59c2-4bb9-a456-8e30369ff8a5-kube-api-access-mcdh6\") pod \"cert-manager-operator-controller-manager-54b9655956-vdlqr\" (UID: \"5e37c0e6-59c2-4bb9-a456-8e30369ff8a5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" Apr 20 19:31:12.064768 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:12.064739 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e37c0e6-59c2-4bb9-a456-8e30369ff8a5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-vdlqr\" (UID: \"5e37c0e6-59c2-4bb9-a456-8e30369ff8a5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" Apr 20 
19:31:12.165442 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:12.165387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e37c0e6-59c2-4bb9-a456-8e30369ff8a5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-vdlqr\" (UID: \"5e37c0e6-59c2-4bb9-a456-8e30369ff8a5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" Apr 20 19:31:12.165680 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:12.165475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcdh6\" (UniqueName: \"kubernetes.io/projected/5e37c0e6-59c2-4bb9-a456-8e30369ff8a5-kube-api-access-mcdh6\") pod \"cert-manager-operator-controller-manager-54b9655956-vdlqr\" (UID: \"5e37c0e6-59c2-4bb9-a456-8e30369ff8a5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" Apr 20 19:31:12.165892 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:12.165869 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5e37c0e6-59c2-4bb9-a456-8e30369ff8a5-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-vdlqr\" (UID: \"5e37c0e6-59c2-4bb9-a456-8e30369ff8a5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" Apr 20 19:31:12.180301 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:12.180266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcdh6\" (UniqueName: \"kubernetes.io/projected/5e37c0e6-59c2-4bb9-a456-8e30369ff8a5-kube-api-access-mcdh6\") pod \"cert-manager-operator-controller-manager-54b9655956-vdlqr\" (UID: \"5e37c0e6-59c2-4bb9-a456-8e30369ff8a5\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" Apr 20 19:31:12.266075 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:12.265970 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" Apr 20 19:31:12.405121 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:12.405087 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr"] Apr 20 19:31:12.407974 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:31:12.407937 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e37c0e6_59c2_4bb9_a456_8e30369ff8a5.slice/crio-84afdca3d1815b72bc8498eb5b1c54ec451d260b8c4a07191feb6b4f6ddfd764 WatchSource:0}: Error finding container 84afdca3d1815b72bc8498eb5b1c54ec451d260b8c4a07191feb6b4f6ddfd764: Status 404 returned error can't find the container with id 84afdca3d1815b72bc8498eb5b1c54ec451d260b8c4a07191feb6b4f6ddfd764 Apr 20 19:31:12.956368 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:12.956327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" event={"ID":"5e37c0e6-59c2-4bb9-a456-8e30369ff8a5","Type":"ContainerStarted","Data":"84afdca3d1815b72bc8498eb5b1c54ec451d260b8c4a07191feb6b4f6ddfd764"} Apr 20 19:31:14.965150 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:14.965041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" event={"ID":"5e37c0e6-59c2-4bb9-a456-8e30369ff8a5","Type":"ContainerStarted","Data":"cb6faa52381a7efec6c5faf698489e91406d940c2f5e4af2c60002d48c20725d"} Apr 20 19:31:14.989654 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:14.989555 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-vdlqr" podStartSLOduration=1.7288685369999999 podStartE2EDuration="3.989536873s" podCreationTimestamp="2026-04-20 
19:31:11 +0000 UTC" firstStartedPulling="2026-04-20 19:31:12.410586465 +0000 UTC m=+328.100999716" lastFinishedPulling="2026-04-20 19:31:14.6712548 +0000 UTC m=+330.361668052" observedRunningTime="2026-04-20 19:31:14.989225053 +0000 UTC m=+330.679638324" watchObservedRunningTime="2026-04-20 19:31:14.989536873 +0000 UTC m=+330.679950139" Apr 20 19:31:16.241126 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.241084 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr"] Apr 20 19:31:16.273428 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.273389 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr"] Apr 20 19:31:16.273596 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.273520 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.276509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.276482 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5pq88\"" Apr 20 19:31:16.276797 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.276779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:31:16.277670 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.277647 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:31:16.405168 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.405135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-bundle\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.405168 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.405169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.405387 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.405230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g49rc\" (UniqueName: \"kubernetes.io/projected/c7906465-9c4f-41af-953d-d544d557e29b-kube-api-access-g49rc\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.506360 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.506257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.506360 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.506305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-util\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.506360 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.506351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g49rc\" (UniqueName: \"kubernetes.io/projected/c7906465-9c4f-41af-953d-d544d557e29b-kube-api-access-g49rc\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.506676 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.506658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.506739 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.506681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.514887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.514852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g49rc\" (UniqueName: \"kubernetes.io/projected/c7906465-9c4f-41af-953d-d544d557e29b-kube-api-access-g49rc\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr\" (UID: 
\"c7906465-9c4f-41af-953d-d544d557e29b\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.583404 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.583373 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:16.716491 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.716455 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr"] Apr 20 19:31:16.719527 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:31:16.719497 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7906465_9c4f_41af_953d_d544d557e29b.slice/crio-5179f26f0a5485980dea98be284b10543cc0edb31f8508333784d3b96d1468ed WatchSource:0}: Error finding container 5179f26f0a5485980dea98be284b10543cc0edb31f8508333784d3b96d1468ed: Status 404 returned error can't find the container with id 5179f26f0a5485980dea98be284b10543cc0edb31f8508333784d3b96d1468ed Apr 20 19:31:16.972331 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.972294 2575 generic.go:358] "Generic (PLEG): container finished" podID="c7906465-9c4f-41af-953d-d544d557e29b" containerID="f5fda27822be39f0a653e6a725a35bf45f56cd117610273430a5ba5d7f8efeb7" exitCode=0 Apr 20 19:31:16.972514 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.972372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" event={"ID":"c7906465-9c4f-41af-953d-d544d557e29b","Type":"ContainerDied","Data":"f5fda27822be39f0a653e6a725a35bf45f56cd117610273430a5ba5d7f8efeb7"} Apr 20 19:31:16.972514 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.972411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" event={"ID":"c7906465-9c4f-41af-953d-d544d557e29b","Type":"ContainerStarted","Data":"5179f26f0a5485980dea98be284b10543cc0edb31f8508333784d3b96d1468ed"} Apr 20 19:31:16.993312 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:16.993273 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-bbrpm"] Apr 20 19:31:17.006012 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.005980 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-bbrpm"] Apr 20 19:31:17.006169 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.006103 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:17.008833 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.008790 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 19:31:17.008999 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.008800 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-4gnb7\"" Apr 20 19:31:17.008999 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.008799 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 19:31:17.112028 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.111984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8jrq\" (UniqueName: \"kubernetes.io/projected/8601eda2-9ecc-4b41-a979-44cea0cd3a73-kube-api-access-t8jrq\") pod \"cert-manager-webhook-587ccfb98-bbrpm\" (UID: \"8601eda2-9ecc-4b41-a979-44cea0cd3a73\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:17.112028 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:31:17.112031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8601eda2-9ecc-4b41-a979-44cea0cd3a73-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-bbrpm\" (UID: \"8601eda2-9ecc-4b41-a979-44cea0cd3a73\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:17.217746 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.213375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8jrq\" (UniqueName: \"kubernetes.io/projected/8601eda2-9ecc-4b41-a979-44cea0cd3a73-kube-api-access-t8jrq\") pod \"cert-manager-webhook-587ccfb98-bbrpm\" (UID: \"8601eda2-9ecc-4b41-a979-44cea0cd3a73\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:17.217746 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.213455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8601eda2-9ecc-4b41-a979-44cea0cd3a73-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-bbrpm\" (UID: \"8601eda2-9ecc-4b41-a979-44cea0cd3a73\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:17.222177 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.222133 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8601eda2-9ecc-4b41-a979-44cea0cd3a73-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-bbrpm\" (UID: \"8601eda2-9ecc-4b41-a979-44cea0cd3a73\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:17.222321 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.222290 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8jrq\" (UniqueName: \"kubernetes.io/projected/8601eda2-9ecc-4b41-a979-44cea0cd3a73-kube-api-access-t8jrq\") pod \"cert-manager-webhook-587ccfb98-bbrpm\" (UID: 
\"8601eda2-9ecc-4b41-a979-44cea0cd3a73\") " pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:17.327832 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.327724 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:17.457992 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.457962 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-bbrpm"] Apr 20 19:31:17.460736 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:31:17.460708 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8601eda2_9ecc_4b41_a979_44cea0cd3a73.slice/crio-b54f7e599dff3ac2cb6c71891fafb8d83728ba5b657bc914b35da9ffd4208460 WatchSource:0}: Error finding container b54f7e599dff3ac2cb6c71891fafb8d83728ba5b657bc914b35da9ffd4208460: Status 404 returned error can't find the container with id b54f7e599dff3ac2cb6c71891fafb8d83728ba5b657bc914b35da9ffd4208460 Apr 20 19:31:17.976949 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:17.976910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" event={"ID":"8601eda2-9ecc-4b41-a979-44cea0cd3a73","Type":"ContainerStarted","Data":"b54f7e599dff3ac2cb6c71891fafb8d83728ba5b657bc914b35da9ffd4208460"} Apr 20 19:31:21.994928 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:21.994892 2575 generic.go:358] "Generic (PLEG): container finished" podID="c7906465-9c4f-41af-953d-d544d557e29b" containerID="70075ccd52007301013c1e463aa0f3c5e8c9542942e804bd563f12e9a88ae924" exitCode=0 Apr 20 19:31:21.995400 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:21.994980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" 
event={"ID":"c7906465-9c4f-41af-953d-d544d557e29b","Type":"ContainerDied","Data":"70075ccd52007301013c1e463aa0f3c5e8c9542942e804bd563f12e9a88ae924"} Apr 20 19:31:21.996577 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:21.996548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" event={"ID":"8601eda2-9ecc-4b41-a979-44cea0cd3a73","Type":"ContainerStarted","Data":"8a47cdde99e100f7d86acbe1d53f7f8cfa831cddc635ca9cfb2ba9e9413b724a"} Apr 20 19:31:21.996709 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:21.996647 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:22.028788 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:22.028717 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" podStartSLOduration=2.280896897 podStartE2EDuration="6.028694567s" podCreationTimestamp="2026-04-20 19:31:16 +0000 UTC" firstStartedPulling="2026-04-20 19:31:17.462716934 +0000 UTC m=+333.153130186" lastFinishedPulling="2026-04-20 19:31:21.210514599 +0000 UTC m=+336.900927856" observedRunningTime="2026-04-20 19:31:22.027990212 +0000 UTC m=+337.718403488" watchObservedRunningTime="2026-04-20 19:31:22.028694567 +0000 UTC m=+337.719107842" Apr 20 19:31:23.002596 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:23.002518 2575 generic.go:358] "Generic (PLEG): container finished" podID="c7906465-9c4f-41af-953d-d544d557e29b" containerID="91c5e94d8a3ad38e841dd8adcd235375a73c0209bdb617e6a17b243380305287" exitCode=0 Apr 20 19:31:23.003080 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:23.002624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" 
event={"ID":"c7906465-9c4f-41af-953d-d544d557e29b","Type":"ContainerDied","Data":"91c5e94d8a3ad38e841dd8adcd235375a73c0209bdb617e6a17b243380305287"} Apr 20 19:31:24.131875 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.131844 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:24.279348 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.279246 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-bundle\") pod \"c7906465-9c4f-41af-953d-d544d557e29b\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " Apr 20 19:31:24.279348 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.279340 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-util\") pod \"c7906465-9c4f-41af-953d-d544d557e29b\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " Apr 20 19:31:24.279608 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.279367 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g49rc\" (UniqueName: \"kubernetes.io/projected/c7906465-9c4f-41af-953d-d544d557e29b-kube-api-access-g49rc\") pod \"c7906465-9c4f-41af-953d-d544d557e29b\" (UID: \"c7906465-9c4f-41af-953d-d544d557e29b\") " Apr 20 19:31:24.279733 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.279707 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-bundle" (OuterVolumeSpecName: "bundle") pod "c7906465-9c4f-41af-953d-d544d557e29b" (UID: "c7906465-9c4f-41af-953d-d544d557e29b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:31:24.281580 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.281555 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7906465-9c4f-41af-953d-d544d557e29b-kube-api-access-g49rc" (OuterVolumeSpecName: "kube-api-access-g49rc") pod "c7906465-9c4f-41af-953d-d544d557e29b" (UID: "c7906465-9c4f-41af-953d-d544d557e29b"). InnerVolumeSpecName "kube-api-access-g49rc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:31:24.283633 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.283578 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-util" (OuterVolumeSpecName: "util") pod "c7906465-9c4f-41af-953d-d544d557e29b" (UID: "c7906465-9c4f-41af-953d-d544d557e29b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:31:24.380327 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.380282 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:24.380327 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.380317 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g49rc\" (UniqueName: \"kubernetes.io/projected/c7906465-9c4f-41af-953d-d544d557e29b-kube-api-access-g49rc\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:24.380327 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:24.380333 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7906465-9c4f-41af-953d-d544d557e29b-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:25.011936 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:25.011903 2575 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" Apr 20 19:31:25.012117 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:25.011901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7pkzr" event={"ID":"c7906465-9c4f-41af-953d-d544d557e29b","Type":"ContainerDied","Data":"5179f26f0a5485980dea98be284b10543cc0edb31f8508333784d3b96d1468ed"} Apr 20 19:31:25.012117 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:25.012029 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5179f26f0a5485980dea98be284b10543cc0edb31f8508333784d3b96d1468ed" Apr 20 19:31:28.005309 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.005280 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-bbrpm" Apr 20 19:31:28.342026 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.341933 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-5tt9g"] Apr 20 19:31:28.342271 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.342259 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7906465-9c4f-41af-953d-d544d557e29b" containerName="extract" Apr 20 19:31:28.342315 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.342274 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7906465-9c4f-41af-953d-d544d557e29b" containerName="extract" Apr 20 19:31:28.342315 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.342290 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7906465-9c4f-41af-953d-d544d557e29b" containerName="pull" Apr 20 19:31:28.342315 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.342295 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7906465-9c4f-41af-953d-d544d557e29b" containerName="pull" Apr 
20 19:31:28.342315 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.342305 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7906465-9c4f-41af-953d-d544d557e29b" containerName="util" Apr 20 19:31:28.342315 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.342311 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7906465-9c4f-41af-953d-d544d557e29b" containerName="util" Apr 20 19:31:28.342459 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.342353 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7906465-9c4f-41af-953d-d544d557e29b" containerName="extract" Apr 20 19:31:28.344550 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.344530 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-5tt9g" Apr 20 19:31:28.347711 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.347687 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-j9t9s\"" Apr 20 19:31:28.356417 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.356387 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-5tt9g"] Apr 20 19:31:28.410760 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.410722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2a632e2-c681-4d83-98bc-34be231e075d-bound-sa-token\") pod \"cert-manager-79c8d999ff-5tt9g\" (UID: \"a2a632e2-c681-4d83-98bc-34be231e075d\") " pod="cert-manager/cert-manager-79c8d999ff-5tt9g" Apr 20 19:31:28.410952 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.410802 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7psh\" (UniqueName: \"kubernetes.io/projected/a2a632e2-c681-4d83-98bc-34be231e075d-kube-api-access-m7psh\") pod 
\"cert-manager-79c8d999ff-5tt9g\" (UID: \"a2a632e2-c681-4d83-98bc-34be231e075d\") " pod="cert-manager/cert-manager-79c8d999ff-5tt9g" Apr 20 19:31:28.511916 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.511878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2a632e2-c681-4d83-98bc-34be231e075d-bound-sa-token\") pod \"cert-manager-79c8d999ff-5tt9g\" (UID: \"a2a632e2-c681-4d83-98bc-34be231e075d\") " pod="cert-manager/cert-manager-79c8d999ff-5tt9g" Apr 20 19:31:28.512096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.511942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7psh\" (UniqueName: \"kubernetes.io/projected/a2a632e2-c681-4d83-98bc-34be231e075d-kube-api-access-m7psh\") pod \"cert-manager-79c8d999ff-5tt9g\" (UID: \"a2a632e2-c681-4d83-98bc-34be231e075d\") " pod="cert-manager/cert-manager-79c8d999ff-5tt9g" Apr 20 19:31:28.521020 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.520981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2a632e2-c681-4d83-98bc-34be231e075d-bound-sa-token\") pod \"cert-manager-79c8d999ff-5tt9g\" (UID: \"a2a632e2-c681-4d83-98bc-34be231e075d\") " pod="cert-manager/cert-manager-79c8d999ff-5tt9g" Apr 20 19:31:28.521165 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.520991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7psh\" (UniqueName: \"kubernetes.io/projected/a2a632e2-c681-4d83-98bc-34be231e075d-kube-api-access-m7psh\") pod \"cert-manager-79c8d999ff-5tt9g\" (UID: \"a2a632e2-c681-4d83-98bc-34be231e075d\") " pod="cert-manager/cert-manager-79c8d999ff-5tt9g" Apr 20 19:31:28.654156 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.654119 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-5tt9g" Apr 20 19:31:28.784168 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:28.784137 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-5tt9g"] Apr 20 19:31:28.786461 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:31:28.786429 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2a632e2_c681_4d83_98bc_34be231e075d.slice/crio-e74b855a1304d185d39b1609b979b24745c7b9d06f0b42346024756fabf08a80 WatchSource:0}: Error finding container e74b855a1304d185d39b1609b979b24745c7b9d06f0b42346024756fabf08a80: Status 404 returned error can't find the container with id e74b855a1304d185d39b1609b979b24745c7b9d06f0b42346024756fabf08a80 Apr 20 19:31:29.027166 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:29.027063 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-5tt9g" event={"ID":"a2a632e2-c681-4d83-98bc-34be231e075d","Type":"ContainerStarted","Data":"475aeeec67c853c4f8807f70a4d989a0373e2ad882e8f65444fa4f84b9ef633e"} Apr 20 19:31:29.027166 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:29.027103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-5tt9g" event={"ID":"a2a632e2-c681-4d83-98bc-34be231e075d","Type":"ContainerStarted","Data":"e74b855a1304d185d39b1609b979b24745c7b9d06f0b42346024756fabf08a80"} Apr 20 19:31:29.042542 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:29.042480 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-5tt9g" podStartSLOduration=1.042463903 podStartE2EDuration="1.042463903s" podCreationTimestamp="2026-04-20 19:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:31:29.042096076 +0000 UTC m=+344.732509350" 
watchObservedRunningTime="2026-04-20 19:31:29.042463903 +0000 UTC m=+344.732877179" Apr 20 19:31:42.209947 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.209904 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q"] Apr 20 19:31:42.215096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.215041 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.218482 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.218446 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:31:42.218482 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.218446 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:31:42.219348 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.219321 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5pq88\"" Apr 20 19:31:42.221924 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.221901 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q"] Apr 20 19:31:42.325233 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.325187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.325413 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:31:42.325256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.325413 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.325342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mqlg\" (UniqueName: \"kubernetes.io/projected/85af9ae4-05b7-4d50-aef9-52444372acd6-kube-api-access-8mqlg\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.426154 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.426114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.426339 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.426179 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.426339 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.426233 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8mqlg\" (UniqueName: \"kubernetes.io/projected/85af9ae4-05b7-4d50-aef9-52444372acd6-kube-api-access-8mqlg\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.426583 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.426555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.426697 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.426555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.436018 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.435976 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mqlg\" (UniqueName: \"kubernetes.io/projected/85af9ae4-05b7-4d50-aef9-52444372acd6-kube-api-access-8mqlg\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.527961 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.527862 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:42.666363 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:42.666330 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q"] Apr 20 19:31:42.669213 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:31:42.669182 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85af9ae4_05b7_4d50_aef9_52444372acd6.slice/crio-b3ade62c5ff5b311969b3a641cb078f512abe1e87b91f0b3a7bccb0d9657be1d WatchSource:0}: Error finding container b3ade62c5ff5b311969b3a641cb078f512abe1e87b91f0b3a7bccb0d9657be1d: Status 404 returned error can't find the container with id b3ade62c5ff5b311969b3a641cb078f512abe1e87b91f0b3a7bccb0d9657be1d Apr 20 19:31:43.075298 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:43.075196 2575 generic.go:358] "Generic (PLEG): container finished" podID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerID="f2624e6fa0b35a412d645e5036e9203bfc68ff50eb31fd1dcec46993a747bf06" exitCode=0 Apr 20 19:31:43.075298 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:43.075288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" event={"ID":"85af9ae4-05b7-4d50-aef9-52444372acd6","Type":"ContainerDied","Data":"f2624e6fa0b35a412d645e5036e9203bfc68ff50eb31fd1dcec46993a747bf06"} Apr 20 19:31:43.075521 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:43.075319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" event={"ID":"85af9ae4-05b7-4d50-aef9-52444372acd6","Type":"ContainerStarted","Data":"b3ade62c5ff5b311969b3a641cb078f512abe1e87b91f0b3a7bccb0d9657be1d"} Apr 20 19:31:44.083416 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:31:44.083328 2575 generic.go:358] "Generic (PLEG): container finished" podID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerID="165eace916f0781960345b37b6583f53d5d13310e338bbb32b0877f2c66a4497" exitCode=0 Apr 20 19:31:44.083416 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:44.083393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" event={"ID":"85af9ae4-05b7-4d50-aef9-52444372acd6","Type":"ContainerDied","Data":"165eace916f0781960345b37b6583f53d5d13310e338bbb32b0877f2c66a4497"} Apr 20 19:31:45.088580 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:45.088541 2575 generic.go:358] "Generic (PLEG): container finished" podID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerID="4496214cd31d5f46d7f8907dd3a08ab85fd016376ab3789c2a1c883fbe36f864" exitCode=0 Apr 20 19:31:45.088976 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:45.088651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" event={"ID":"85af9ae4-05b7-4d50-aef9-52444372acd6","Type":"ContainerDied","Data":"4496214cd31d5f46d7f8907dd3a08ab85fd016376ab3789c2a1c883fbe36f864"} Apr 20 19:31:46.228411 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.228384 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:46.262059 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.262025 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mqlg\" (UniqueName: \"kubernetes.io/projected/85af9ae4-05b7-4d50-aef9-52444372acd6-kube-api-access-8mqlg\") pod \"85af9ae4-05b7-4d50-aef9-52444372acd6\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " Apr 20 19:31:46.262253 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.262099 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-bundle\") pod \"85af9ae4-05b7-4d50-aef9-52444372acd6\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " Apr 20 19:31:46.262253 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.262117 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-util\") pod \"85af9ae4-05b7-4d50-aef9-52444372acd6\" (UID: \"85af9ae4-05b7-4d50-aef9-52444372acd6\") " Apr 20 19:31:46.262888 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.262857 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-bundle" (OuterVolumeSpecName: "bundle") pod "85af9ae4-05b7-4d50-aef9-52444372acd6" (UID: "85af9ae4-05b7-4d50-aef9-52444372acd6"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:31:46.264477 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.264448 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85af9ae4-05b7-4d50-aef9-52444372acd6-kube-api-access-8mqlg" (OuterVolumeSpecName: "kube-api-access-8mqlg") pod "85af9ae4-05b7-4d50-aef9-52444372acd6" (UID: "85af9ae4-05b7-4d50-aef9-52444372acd6"). InnerVolumeSpecName "kube-api-access-8mqlg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:31:46.267924 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.267896 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-util" (OuterVolumeSpecName: "util") pod "85af9ae4-05b7-4d50-aef9-52444372acd6" (UID: "85af9ae4-05b7-4d50-aef9-52444372acd6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:31:46.362956 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.362864 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8mqlg\" (UniqueName: \"kubernetes.io/projected/85af9ae4-05b7-4d50-aef9-52444372acd6-kube-api-access-8mqlg\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:46.362956 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.362898 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:46.362956 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:46.362908 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/85af9ae4-05b7-4d50-aef9-52444372acd6-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:47.097343 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:47.097306 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" event={"ID":"85af9ae4-05b7-4d50-aef9-52444372acd6","Type":"ContainerDied","Data":"b3ade62c5ff5b311969b3a641cb078f512abe1e87b91f0b3a7bccb0d9657be1d"} Apr 20 19:31:47.097343 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:47.097339 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c52nv6q" Apr 20 19:31:47.097555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:47.097344 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3ade62c5ff5b311969b3a641cb078f512abe1e87b91f0b3a7bccb0d9657be1d" Apr 20 19:31:51.828845 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.828805 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk"] Apr 20 19:31:51.829275 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.829258 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerName="util" Apr 20 19:31:51.829333 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.829278 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerName="util" Apr 20 19:31:51.829333 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.829289 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerName="pull" Apr 20 19:31:51.829333 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.829296 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerName="pull" Apr 20 19:31:51.829333 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.829312 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerName="extract" Apr 20 19:31:51.829333 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.829319 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerName="extract" Apr 20 19:31:51.829505 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.829407 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="85af9ae4-05b7-4d50-aef9-52444372acd6" containerName="extract" Apr 20 19:31:51.833135 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.833111 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:51.845423 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.845390 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:31:51.845599 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.845407 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:31:51.846353 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.846282 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5pq88\"" Apr 20 19:31:51.849280 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.849249 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk"] Apr 20 19:31:51.908998 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.908943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk\" (UID: 
\"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:51.909181 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.909030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:51.909181 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:51.909101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5j7d\" (UniqueName: \"kubernetes.io/projected/5edc49a3-0f41-47fc-a53d-c2645de53a38-kube-api-access-g5j7d\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:52.009951 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:52.009895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5j7d\" (UniqueName: \"kubernetes.io/projected/5edc49a3-0f41-47fc-a53d-c2645de53a38-kube-api-access-g5j7d\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:52.010169 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:52.009985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk\" (UID: 
\"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:52.010169 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:52.010078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:52.010460 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:52.010435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:52.010554 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:52.010512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:52.019222 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:52.019184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5j7d\" (UniqueName: \"kubernetes.io/projected/5edc49a3-0f41-47fc-a53d-c2645de53a38-kube-api-access-g5j7d\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:52.157432 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:52.157390 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:52.516041 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:52.516014 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk"] Apr 20 19:31:52.518796 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:31:52.518758 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5edc49a3_0f41_47fc_a53d_c2645de53a38.slice/crio-ca3e333a783de270a87065ccb43317a2dc92edb8c0c773ca4f21e6cc98fbcbc1 WatchSource:0}: Error finding container ca3e333a783de270a87065ccb43317a2dc92edb8c0c773ca4f21e6cc98fbcbc1: Status 404 returned error can't find the container with id ca3e333a783de270a87065ccb43317a2dc92edb8c0c773ca4f21e6cc98fbcbc1 Apr 20 19:31:53.118916 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.118879 2575 generic.go:358] "Generic (PLEG): container finished" podID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerID="b43eb3908269ccb5a058bcdb58a79c04ac6ba0604f2ba5cc42ff8a44a1ac6379" exitCode=0 Apr 20 19:31:53.119455 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.118952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" event={"ID":"5edc49a3-0f41-47fc-a53d-c2645de53a38","Type":"ContainerDied","Data":"b43eb3908269ccb5a058bcdb58a79c04ac6ba0604f2ba5cc42ff8a44a1ac6379"} Apr 20 19:31:53.119455 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.118996 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" event={"ID":"5edc49a3-0f41-47fc-a53d-c2645de53a38","Type":"ContainerStarted","Data":"ca3e333a783de270a87065ccb43317a2dc92edb8c0c773ca4f21e6cc98fbcbc1"} Apr 20 19:31:53.780396 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.780364 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22"] Apr 20 19:31:53.785162 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.785135 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:53.787819 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.787790 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 19:31:53.788087 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.788069 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 19:31:53.788165 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.788097 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 19:31:53.788165 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.788099 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 19:31:53.788979 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.788961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-cx2tl\"" Apr 20 19:31:53.796480 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.796447 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22"] Apr 
20 19:31:53.827388 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.827352 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbc7t\" (UniqueName: \"kubernetes.io/projected/39b9a935-d4c4-4f7d-b2f1-461da5d3c126-kube-api-access-pbc7t\") pod \"opendatahub-operator-controller-manager-9f747d685-cdf22\" (UID: \"39b9a935-d4c4-4f7d-b2f1-461da5d3c126\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:53.827555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.827397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/39b9a935-d4c4-4f7d-b2f1-461da5d3c126-webhook-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-cdf22\" (UID: \"39b9a935-d4c4-4f7d-b2f1-461da5d3c126\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:53.827555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.827448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/39b9a935-d4c4-4f7d-b2f1-461da5d3c126-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-cdf22\" (UID: \"39b9a935-d4c4-4f7d-b2f1-461da5d3c126\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:53.927995 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.927898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/39b9a935-d4c4-4f7d-b2f1-461da5d3c126-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-cdf22\" (UID: \"39b9a935-d4c4-4f7d-b2f1-461da5d3c126\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:53.927995 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.927993 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbc7t\" (UniqueName: \"kubernetes.io/projected/39b9a935-d4c4-4f7d-b2f1-461da5d3c126-kube-api-access-pbc7t\") pod \"opendatahub-operator-controller-manager-9f747d685-cdf22\" (UID: \"39b9a935-d4c4-4f7d-b2f1-461da5d3c126\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:53.928189 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.928020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/39b9a935-d4c4-4f7d-b2f1-461da5d3c126-webhook-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-cdf22\" (UID: \"39b9a935-d4c4-4f7d-b2f1-461da5d3c126\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:53.930766 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.930729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/39b9a935-d4c4-4f7d-b2f1-461da5d3c126-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-cdf22\" (UID: \"39b9a935-d4c4-4f7d-b2f1-461da5d3c126\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:53.930906 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.930796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/39b9a935-d4c4-4f7d-b2f1-461da5d3c126-webhook-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-cdf22\" (UID: \"39b9a935-d4c4-4f7d-b2f1-461da5d3c126\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:53.937943 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:53.937906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbc7t\" (UniqueName: 
\"kubernetes.io/projected/39b9a935-d4c4-4f7d-b2f1-461da5d3c126-kube-api-access-pbc7t\") pod \"opendatahub-operator-controller-manager-9f747d685-cdf22\" (UID: \"39b9a935-d4c4-4f7d-b2f1-461da5d3c126\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:54.118729 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:54.118689 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:54.124950 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:54.124913 2575 generic.go:358] "Generic (PLEG): container finished" podID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerID="632031304b1113575e3b318cf1dbecf50cec776c495518f3acbec0844163d54b" exitCode=0 Apr 20 19:31:54.125316 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:54.124967 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" event={"ID":"5edc49a3-0f41-47fc-a53d-c2645de53a38","Type":"ContainerDied","Data":"632031304b1113575e3b318cf1dbecf50cec776c495518f3acbec0844163d54b"} Apr 20 19:31:54.269286 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:54.269251 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22"] Apr 20 19:31:54.272769 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:31:54.272730 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b9a935_d4c4_4f7d_b2f1_461da5d3c126.slice/crio-0e38dc54ad4e1da649fee0678647dbc477a7c0bdd3e747e7d89ebfbf36892124 WatchSource:0}: Error finding container 0e38dc54ad4e1da649fee0678647dbc477a7c0bdd3e747e7d89ebfbf36892124: Status 404 returned error can't find the container with id 0e38dc54ad4e1da649fee0678647dbc477a7c0bdd3e747e7d89ebfbf36892124 Apr 20 19:31:55.131822 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:31:55.131780 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" event={"ID":"39b9a935-d4c4-4f7d-b2f1-461da5d3c126","Type":"ContainerStarted","Data":"0e38dc54ad4e1da649fee0678647dbc477a7c0bdd3e747e7d89ebfbf36892124"} Apr 20 19:31:55.134983 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:55.134663 2575 generic.go:358] "Generic (PLEG): container finished" podID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerID="7aee767ee5d8c1157a272c5426c46542f8e867e97a61c04574328b9138cee2f1" exitCode=0 Apr 20 19:31:55.134983 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:55.134726 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" event={"ID":"5edc49a3-0f41-47fc-a53d-c2645de53a38","Type":"ContainerDied","Data":"7aee767ee5d8c1157a272c5426c46542f8e867e97a61c04574328b9138cee2f1"} Apr 20 19:31:56.830142 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:56.830112 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:56.959730 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:56.959695 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-util\") pod \"5edc49a3-0f41-47fc-a53d-c2645de53a38\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " Apr 20 19:31:56.959911 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:56.959804 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5j7d\" (UniqueName: \"kubernetes.io/projected/5edc49a3-0f41-47fc-a53d-c2645de53a38-kube-api-access-g5j7d\") pod \"5edc49a3-0f41-47fc-a53d-c2645de53a38\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " Apr 20 19:31:56.959911 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:56.959855 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-bundle\") pod \"5edc49a3-0f41-47fc-a53d-c2645de53a38\" (UID: \"5edc49a3-0f41-47fc-a53d-c2645de53a38\") " Apr 20 19:31:56.960944 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:56.960910 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-bundle" (OuterVolumeSpecName: "bundle") pod "5edc49a3-0f41-47fc-a53d-c2645de53a38" (UID: "5edc49a3-0f41-47fc-a53d-c2645de53a38"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:31:56.962339 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:56.962300 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5edc49a3-0f41-47fc-a53d-c2645de53a38-kube-api-access-g5j7d" (OuterVolumeSpecName: "kube-api-access-g5j7d") pod "5edc49a3-0f41-47fc-a53d-c2645de53a38" (UID: "5edc49a3-0f41-47fc-a53d-c2645de53a38"). InnerVolumeSpecName "kube-api-access-g5j7d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:31:56.965737 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:56.965697 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-util" (OuterVolumeSpecName: "util") pod "5edc49a3-0f41-47fc-a53d-c2645de53a38" (UID: "5edc49a3-0f41-47fc-a53d-c2645de53a38"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:31:57.061139 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:57.061040 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:57.061139 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:57.061078 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5edc49a3-0f41-47fc-a53d-c2645de53a38-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:57.061139 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:57.061088 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5j7d\" (UniqueName: \"kubernetes.io/projected/5edc49a3-0f41-47fc-a53d-c2645de53a38-kube-api-access-g5j7d\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:31:57.143515 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:57.143476 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" event={"ID":"39b9a935-d4c4-4f7d-b2f1-461da5d3c126","Type":"ContainerStarted","Data":"28b6dab464100781c8e079392143fa2abe79a5c5682609372fa91ce3fb82a3ec"} Apr 20 19:31:57.143731 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:57.143571 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:31:57.145316 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:57.145275 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" event={"ID":"5edc49a3-0f41-47fc-a53d-c2645de53a38","Type":"ContainerDied","Data":"ca3e333a783de270a87065ccb43317a2dc92edb8c0c773ca4f21e6cc98fbcbc1"} Apr 20 19:31:57.145316 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:57.145311 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vmvfk" Apr 20 19:31:57.145502 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:57.145318 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca3e333a783de270a87065ccb43317a2dc92edb8c0c773ca4f21e6cc98fbcbc1" Apr 20 19:31:57.180343 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:57.180282 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" podStartSLOduration=1.5921664020000001 podStartE2EDuration="4.180264495s" podCreationTimestamp="2026-04-20 19:31:53 +0000 UTC" firstStartedPulling="2026-04-20 19:31:54.274463034 +0000 UTC m=+369.964876286" lastFinishedPulling="2026-04-20 19:31:56.862561119 +0000 UTC m=+372.552974379" observedRunningTime="2026-04-20 19:31:57.177293033 +0000 UTC m=+372.867706307" watchObservedRunningTime="2026-04-20 19:31:57.180264495 +0000 UTC 
m=+372.870677769" Apr 20 19:31:58.883296 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.883254 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn"] Apr 20 19:31:58.883718 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.883572 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerName="pull" Apr 20 19:31:58.883718 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.883583 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerName="pull" Apr 20 19:31:58.883718 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.883605 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerName="extract" Apr 20 19:31:58.883718 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.883639 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerName="extract" Apr 20 19:31:58.883718 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.883649 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerName="util" Apr 20 19:31:58.883718 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.883654 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerName="util" Apr 20 19:31:58.883718 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.883710 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5edc49a3-0f41-47fc-a53d-c2645de53a38" containerName="extract" Apr 20 19:31:58.887781 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.887756 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:58.890498 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.890470 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 19:31:58.890668 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.890470 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 19:31:58.891478 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.891450 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 19:31:58.891643 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.891450 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-8lsbm\"" Apr 20 19:31:58.891643 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.891452 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:31:58.891643 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.891453 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 19:31:58.895819 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.895795 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn"] Apr 20 19:31:58.977428 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.977393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/244b2580-8d8a-40f8-915e-e44bcee11364-cert\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " 
pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:58.977681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.977470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/244b2580-8d8a-40f8-915e-e44bcee11364-metrics-cert\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:58.977681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.977503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww8v6\" (UniqueName: \"kubernetes.io/projected/244b2580-8d8a-40f8-915e-e44bcee11364-kube-api-access-ww8v6\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:58.977681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:58.977521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/244b2580-8d8a-40f8-915e-e44bcee11364-manager-config\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.078934 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.078873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/244b2580-8d8a-40f8-915e-e44bcee11364-metrics-cert\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.078934 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.078937 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ww8v6\" (UniqueName: \"kubernetes.io/projected/244b2580-8d8a-40f8-915e-e44bcee11364-kube-api-access-ww8v6\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.079177 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.078961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/244b2580-8d8a-40f8-915e-e44bcee11364-manager-config\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.079177 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.079095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/244b2580-8d8a-40f8-915e-e44bcee11364-cert\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.079759 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.079693 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/244b2580-8d8a-40f8-915e-e44bcee11364-manager-config\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.081673 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.081649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/244b2580-8d8a-40f8-915e-e44bcee11364-metrics-cert\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: 
\"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.081822 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.081801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/244b2580-8d8a-40f8-915e-e44bcee11364-cert\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.087865 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.087833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww8v6\" (UniqueName: \"kubernetes.io/projected/244b2580-8d8a-40f8-915e-e44bcee11364-kube-api-access-ww8v6\") pod \"lws-controller-manager-6ddf46b867-qzppn\" (UID: \"244b2580-8d8a-40f8-915e-e44bcee11364\") " pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.198736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.198642 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:31:59.333929 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:31:59.333887 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn"] Apr 20 19:31:59.337339 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:31:59.337298 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244b2580_8d8a_40f8_915e_e44bcee11364.slice/crio-7b7df179677c0c69597d2374d5321dbda8ba726810c1df3359197d6bf5fb266a WatchSource:0}: Error finding container 7b7df179677c0c69597d2374d5321dbda8ba726810c1df3359197d6bf5fb266a: Status 404 returned error can't find the container with id 7b7df179677c0c69597d2374d5321dbda8ba726810c1df3359197d6bf5fb266a Apr 20 19:32:00.156909 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:00.156873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" event={"ID":"244b2580-8d8a-40f8-915e-e44bcee11364","Type":"ContainerStarted","Data":"7b7df179677c0c69597d2374d5321dbda8ba726810c1df3359197d6bf5fb266a"} Apr 20 19:32:02.165684 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:02.165646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" event={"ID":"244b2580-8d8a-40f8-915e-e44bcee11364","Type":"ContainerStarted","Data":"c42b7709cc42673d696abc7a22e6136305e0eb40580e1d4ad3390be39dab7c90"} Apr 20 19:32:02.166068 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:02.165715 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:32:02.182939 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:02.182872 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" podStartSLOduration=1.680774395 podStartE2EDuration="4.182850543s" podCreationTimestamp="2026-04-20 19:31:58 +0000 UTC" firstStartedPulling="2026-04-20 19:31:59.339420522 +0000 UTC m=+375.029833777" lastFinishedPulling="2026-04-20 19:32:01.841496672 +0000 UTC m=+377.531909925" observedRunningTime="2026-04-20 19:32:02.181505387 +0000 UTC m=+377.871918690" watchObservedRunningTime="2026-04-20 19:32:02.182850543 +0000 UTC m=+377.873263818" Apr 20 19:32:08.150712 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:08.150678 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-cdf22" Apr 20 19:32:13.172214 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:13.172181 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6ddf46b867-qzppn" Apr 20 19:32:21.631933 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.631882 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct"] Apr 20 19:32:21.636504 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.636472 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.639523 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.639486 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:32:21.639727 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.639711 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5pq88\"" Apr 20 19:32:21.640512 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.640492 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:32:21.644877 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.644824 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct"] Apr 20 19:32:21.744350 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.744290 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt"] Apr 20 19:32:21.747887 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.747858 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:21.750438 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.750398 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 19:32:21.750650 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.750454 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 19:32:21.750650 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.750538 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-blzjv\"" Apr 20 19:32:21.758701 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.758490 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt"] Apr 20 19:32:21.776015 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.775974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.776192 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.776027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzhwh\" (UniqueName: \"kubernetes.io/projected/d546a132-7444-4bf4-8fff-3916cb6f7949-kube-api-access-wzhwh\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.776280 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:32:21.776233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.877356 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.877316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e71888-9f98-483b-ae51-52a227f9b41c-tls-certs\") pod \"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: \"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:21.877356 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.877361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4e71888-9f98-483b-ae51-52a227f9b41c-tmp\") pod \"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: \"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:21.877671 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.877385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msv58\" (UniqueName: \"kubernetes.io/projected/e4e71888-9f98-483b-ae51-52a227f9b41c-kube-api-access-msv58\") pod \"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: \"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:21.877671 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.877453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-util\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.877671 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.877483 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.877671 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.877578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzhwh\" (UniqueName: \"kubernetes.io/projected/d546a132-7444-4bf4-8fff-3916cb6f7949-kube-api-access-wzhwh\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.877862 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.877844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.877977 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.877957 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct\" (UID: 
\"d546a132-7444-4bf4-8fff-3916cb6f7949\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.886127 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.886087 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzhwh\" (UniqueName: \"kubernetes.io/projected/d546a132-7444-4bf4-8fff-3916cb6f7949-kube-api-access-wzhwh\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.948061 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.948014 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:21.978215 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.978177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e71888-9f98-483b-ae51-52a227f9b41c-tls-certs\") pod \"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: \"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:21.978215 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.978218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4e71888-9f98-483b-ae51-52a227f9b41c-tmp\") pod \"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: \"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:21.978431 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.978237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msv58\" (UniqueName: \"kubernetes.io/projected/e4e71888-9f98-483b-ae51-52a227f9b41c-kube-api-access-msv58\") pod 
\"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: \"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:21.978431 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:32:21.978353 2575 secret.go:189] Couldn't get secret openshift-ingress/kube-auth-proxy-tls: secret "kube-auth-proxy-tls" not found Apr 20 19:32:21.978504 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:32:21.978472 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4e71888-9f98-483b-ae51-52a227f9b41c-tls-certs podName:e4e71888-9f98-483b-ae51-52a227f9b41c nodeName:}" failed. No retries permitted until 2026-04-20 19:32:22.478429837 +0000 UTC m=+398.168843090 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/e4e71888-9f98-483b-ae51-52a227f9b41c-tls-certs") pod "kube-auth-proxy-79f76cb8cc-9vbzt" (UID: "e4e71888-9f98-483b-ae51-52a227f9b41c") : secret "kube-auth-proxy-tls" not found Apr 20 19:32:21.980757 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.980730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4e71888-9f98-483b-ae51-52a227f9b41c-tmp\") pod \"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: \"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:21.987710 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:21.987674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msv58\" (UniqueName: \"kubernetes.io/projected/e4e71888-9f98-483b-ae51-52a227f9b41c-kube-api-access-msv58\") pod \"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: \"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:22.085750 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:22.085694 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct"] Apr 20 19:32:22.088134 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:32:22.088098 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd546a132_7444_4bf4_8fff_3916cb6f7949.slice/crio-de1f908f0cbaf892bf03f1d37b0cb7231c884502a01e80cca865fe446c9f2010 WatchSource:0}: Error finding container de1f908f0cbaf892bf03f1d37b0cb7231c884502a01e80cca865fe446c9f2010: Status 404 returned error can't find the container with id de1f908f0cbaf892bf03f1d37b0cb7231c884502a01e80cca865fe446c9f2010 Apr 20 19:32:22.238291 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:22.238258 2575 generic.go:358] "Generic (PLEG): container finished" podID="d546a132-7444-4bf4-8fff-3916cb6f7949" containerID="4bd2c6c8666969611c7031d0b913ea56b9bd3d5d35e46595a78b93035cf5bec6" exitCode=0 Apr 20 19:32:22.238472 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:22.238341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" event={"ID":"d546a132-7444-4bf4-8fff-3916cb6f7949","Type":"ContainerDied","Data":"4bd2c6c8666969611c7031d0b913ea56b9bd3d5d35e46595a78b93035cf5bec6"} Apr 20 19:32:22.238472 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:22.238373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" event={"ID":"d546a132-7444-4bf4-8fff-3916cb6f7949","Type":"ContainerStarted","Data":"de1f908f0cbaf892bf03f1d37b0cb7231c884502a01e80cca865fe446c9f2010"} Apr 20 19:32:22.483182 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:22.483088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e71888-9f98-483b-ae51-52a227f9b41c-tls-certs\") pod \"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: 
\"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:22.485817 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:22.485787 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e71888-9f98-483b-ae51-52a227f9b41c-tls-certs\") pod \"kube-auth-proxy-79f76cb8cc-9vbzt\" (UID: \"e4e71888-9f98-483b-ae51-52a227f9b41c\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:22.659913 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:22.659868 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" Apr 20 19:32:22.788842 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:22.788812 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt"] Apr 20 19:32:22.791223 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:32:22.791190 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4e71888_9f98_483b_ae51_52a227f9b41c.slice/crio-a49a82f3fa5d27224549459aa59032845fce080da384ae688ed13b23235aa1ae WatchSource:0}: Error finding container a49a82f3fa5d27224549459aa59032845fce080da384ae688ed13b23235aa1ae: Status 404 returned error can't find the container with id a49a82f3fa5d27224549459aa59032845fce080da384ae688ed13b23235aa1ae Apr 20 19:32:23.243451 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:23.243403 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" event={"ID":"e4e71888-9f98-483b-ae51-52a227f9b41c","Type":"ContainerStarted","Data":"a49a82f3fa5d27224549459aa59032845fce080da384ae688ed13b23235aa1ae"} Apr 20 19:32:23.245169 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:23.245133 2575 generic.go:358] "Generic (PLEG): container finished" podID="d546a132-7444-4bf4-8fff-3916cb6f7949" 
containerID="fa4e9c467c11b1d73923b383c97c43842d891b9641fbc3e93d56ff118280fd62" exitCode=0 Apr 20 19:32:23.245339 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:23.245208 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" event={"ID":"d546a132-7444-4bf4-8fff-3916cb6f7949","Type":"ContainerDied","Data":"fa4e9c467c11b1d73923b383c97c43842d891b9641fbc3e93d56ff118280fd62"} Apr 20 19:32:24.251420 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:24.251384 2575 generic.go:358] "Generic (PLEG): container finished" podID="d546a132-7444-4bf4-8fff-3916cb6f7949" containerID="920b1e73cc083b7435c1ee66da42f2431cb7879266c678bc01160c1c4d762561" exitCode=0 Apr 20 19:32:24.252017 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:24.251503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" event={"ID":"d546a132-7444-4bf4-8fff-3916cb6f7949","Type":"ContainerDied","Data":"920b1e73cc083b7435c1ee66da42f2431cb7879266c678bc01160c1c4d762561"} Apr 20 19:32:25.875603 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:25.875560 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:26.015718 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.015678 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-util\") pod \"d546a132-7444-4bf4-8fff-3916cb6f7949\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " Apr 20 19:32:26.015896 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.015779 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-bundle\") pod \"d546a132-7444-4bf4-8fff-3916cb6f7949\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " Apr 20 19:32:26.015896 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.015820 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzhwh\" (UniqueName: \"kubernetes.io/projected/d546a132-7444-4bf4-8fff-3916cb6f7949-kube-api-access-wzhwh\") pod \"d546a132-7444-4bf4-8fff-3916cb6f7949\" (UID: \"d546a132-7444-4bf4-8fff-3916cb6f7949\") " Apr 20 19:32:26.016781 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.016749 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-bundle" (OuterVolumeSpecName: "bundle") pod "d546a132-7444-4bf4-8fff-3916cb6f7949" (UID: "d546a132-7444-4bf4-8fff-3916cb6f7949"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:32:26.018297 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.018269 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d546a132-7444-4bf4-8fff-3916cb6f7949-kube-api-access-wzhwh" (OuterVolumeSpecName: "kube-api-access-wzhwh") pod "d546a132-7444-4bf4-8fff-3916cb6f7949" (UID: "d546a132-7444-4bf4-8fff-3916cb6f7949"). InnerVolumeSpecName "kube-api-access-wzhwh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:32:26.020586 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.020546 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-util" (OuterVolumeSpecName: "util") pod "d546a132-7444-4bf4-8fff-3916cb6f7949" (UID: "d546a132-7444-4bf4-8fff-3916cb6f7949"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:32:26.117698 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.117580 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:32:26.117698 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.117635 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wzhwh\" (UniqueName: \"kubernetes.io/projected/d546a132-7444-4bf4-8fff-3916cb6f7949-kube-api-access-wzhwh\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:32:26.117698 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.117647 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d546a132-7444-4bf4-8fff-3916cb6f7949-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:32:26.261190 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.261151 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" event={"ID":"e4e71888-9f98-483b-ae51-52a227f9b41c","Type":"ContainerStarted","Data":"e93790e758e31b23d238b691f736b0ea701a6b5a2547ce4935a9d9e74fb9fc64"} Apr 20 19:32:26.263045 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.263011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" event={"ID":"d546a132-7444-4bf4-8fff-3916cb6f7949","Type":"ContainerDied","Data":"de1f908f0cbaf892bf03f1d37b0cb7231c884502a01e80cca865fe446c9f2010"} Apr 20 19:32:26.263045 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.263042 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835k8zct" Apr 20 19:32:26.263045 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.263052 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de1f908f0cbaf892bf03f1d37b0cb7231c884502a01e80cca865fe446c9f2010" Apr 20 19:32:26.278133 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:26.278076 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-9vbzt" podStartSLOduration=2.155024072 podStartE2EDuration="5.278059756s" podCreationTimestamp="2026-04-20 19:32:21 +0000 UTC" firstStartedPulling="2026-04-20 19:32:22.793202716 +0000 UTC m=+398.483615970" lastFinishedPulling="2026-04-20 19:32:25.916238388 +0000 UTC m=+401.606651654" observedRunningTime="2026-04-20 19:32:26.276408384 +0000 UTC m=+401.966821683" watchObservedRunningTime="2026-04-20 19:32:26.278059756 +0000 UTC m=+401.968473050" Apr 20 19:32:35.414996 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.414961 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc"] Apr 20 19:32:35.415410 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.415316 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d546a132-7444-4bf4-8fff-3916cb6f7949" containerName="util" Apr 20 19:32:35.415410 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.415326 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d546a132-7444-4bf4-8fff-3916cb6f7949" containerName="util" Apr 20 19:32:35.415410 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.415340 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d546a132-7444-4bf4-8fff-3916cb6f7949" containerName="extract" Apr 20 19:32:35.415410 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.415346 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d546a132-7444-4bf4-8fff-3916cb6f7949" containerName="extract" Apr 20 19:32:35.415410 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.415357 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d546a132-7444-4bf4-8fff-3916cb6f7949" containerName="pull" Apr 20 19:32:35.415410 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.415362 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d546a132-7444-4bf4-8fff-3916cb6f7949" containerName="pull" Apr 20 19:32:35.415595 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.415422 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d546a132-7444-4bf4-8fff-3916cb6f7949" containerName="extract" Apr 20 19:32:35.421997 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.421974 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.425454 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.425427 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 19:32:35.426395 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.426362 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 19:32:35.426395 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.426382 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5pq88\"" Apr 20 19:32:35.429512 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.429479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc"] Apr 20 19:32:35.486402 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.486360 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.486586 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.486469 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.486586 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.486499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qvdl\" (UniqueName: \"kubernetes.io/projected/c9bd170d-e73d-449a-bf32-d16b74c782ee-kube-api-access-5qvdl\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.587308 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.587258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.587308 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.587313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qvdl\" (UniqueName: \"kubernetes.io/projected/c9bd170d-e73d-449a-bf32-d16b74c782ee-kube-api-access-5qvdl\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.587531 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.587371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.587700 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:32:35.587677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.587761 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.587746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.601272 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.601234 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qvdl\" (UniqueName: \"kubernetes.io/projected/c9bd170d-e73d-449a-bf32-d16b74c782ee-kube-api-access-5qvdl\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.733027 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.732934 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:35.903706 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:35.903674 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc"] Apr 20 19:32:35.906055 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:32:35.906021 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9bd170d_e73d_449a_bf32_d16b74c782ee.slice/crio-b74b3c7971fc53bbc25bf6e36b6008608af18107abfec7b43d96f752ee3a7135 WatchSource:0}: Error finding container b74b3c7971fc53bbc25bf6e36b6008608af18107abfec7b43d96f752ee3a7135: Status 404 returned error can't find the container with id b74b3c7971fc53bbc25bf6e36b6008608af18107abfec7b43d96f752ee3a7135 Apr 20 19:32:36.301729 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:36.301633 2575 generic.go:358] "Generic (PLEG): container finished" podID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerID="5c7ff32f2aaee5185c2b5de3d44754f0b3910e213a400b482d2c4dc71dc1650a" exitCode=0 Apr 20 19:32:36.301729 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:36.301717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" event={"ID":"c9bd170d-e73d-449a-bf32-d16b74c782ee","Type":"ContainerDied","Data":"5c7ff32f2aaee5185c2b5de3d44754f0b3910e213a400b482d2c4dc71dc1650a"} Apr 20 19:32:36.301916 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:36.301750 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" event={"ID":"c9bd170d-e73d-449a-bf32-d16b74c782ee","Type":"ContainerStarted","Data":"b74b3c7971fc53bbc25bf6e36b6008608af18107abfec7b43d96f752ee3a7135"} Apr 20 19:32:37.307810 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:32:37.307715 2575 generic.go:358] "Generic (PLEG): container finished" podID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerID="57f5811dd34de359f17fe9aaec6d90813a25bce36aaa57287508278f06842009" exitCode=0 Apr 20 19:32:37.308180 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:37.307806 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" event={"ID":"c9bd170d-e73d-449a-bf32-d16b74c782ee","Type":"ContainerDied","Data":"57f5811dd34de359f17fe9aaec6d90813a25bce36aaa57287508278f06842009"} Apr 20 19:32:38.313941 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:38.313910 2575 generic.go:358] "Generic (PLEG): container finished" podID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerID="fab56ecd819fa026144a882eff55ee9dc92c238628a491a829b2eafb2a5f9238" exitCode=0 Apr 20 19:32:38.314331 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:38.314009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" event={"ID":"c9bd170d-e73d-449a-bf32-d16b74c782ee","Type":"ContainerDied","Data":"fab56ecd819fa026144a882eff55ee9dc92c238628a491a829b2eafb2a5f9238"} Apr 20 19:32:39.444119 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.444093 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:32:39.521883 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.521853 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qvdl\" (UniqueName: \"kubernetes.io/projected/c9bd170d-e73d-449a-bf32-d16b74c782ee-kube-api-access-5qvdl\") pod \"c9bd170d-e73d-449a-bf32-d16b74c782ee\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " Apr 20 19:32:39.522065 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.521897 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-bundle\") pod \"c9bd170d-e73d-449a-bf32-d16b74c782ee\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " Apr 20 19:32:39.522065 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.521917 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-util\") pod \"c9bd170d-e73d-449a-bf32-d16b74c782ee\" (UID: \"c9bd170d-e73d-449a-bf32-d16b74c782ee\") " Apr 20 19:32:39.522756 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.522720 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-bundle" (OuterVolumeSpecName: "bundle") pod "c9bd170d-e73d-449a-bf32-d16b74c782ee" (UID: "c9bd170d-e73d-449a-bf32-d16b74c782ee"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:32:39.524133 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.524102 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9bd170d-e73d-449a-bf32-d16b74c782ee-kube-api-access-5qvdl" (OuterVolumeSpecName: "kube-api-access-5qvdl") pod "c9bd170d-e73d-449a-bf32-d16b74c782ee" (UID: "c9bd170d-e73d-449a-bf32-d16b74c782ee"). InnerVolumeSpecName "kube-api-access-5qvdl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:32:39.527318 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.527289 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-util" (OuterVolumeSpecName: "util") pod "c9bd170d-e73d-449a-bf32-d16b74c782ee" (UID: "c9bd170d-e73d-449a-bf32-d16b74c782ee"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:32:39.623290 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.623185 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5qvdl\" (UniqueName: \"kubernetes.io/projected/c9bd170d-e73d-449a-bf32-d16b74c782ee-kube-api-access-5qvdl\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:32:39.623290 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.623223 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:32:39.623290 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:39.623233 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9bd170d-e73d-449a-bf32-d16b74c782ee-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:32:40.328469 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:40.328432 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" event={"ID":"c9bd170d-e73d-449a-bf32-d16b74c782ee","Type":"ContainerDied","Data":"b74b3c7971fc53bbc25bf6e36b6008608af18107abfec7b43d96f752ee3a7135"} Apr 20 19:32:40.328672 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:40.328474 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b74b3c7971fc53bbc25bf6e36b6008608af18107abfec7b43d96f752ee3a7135" Apr 20 19:32:40.328672 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:32:40.328492 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jhwlc" Apr 20 19:33:26.817181 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.817138 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg"] Apr 20 19:33:26.817674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.817490 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerName="pull" Apr 20 19:33:26.817674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.817502 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerName="pull" Apr 20 19:33:26.817674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.817516 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerName="extract" Apr 20 19:33:26.817674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.817523 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerName="extract" Apr 20 19:33:26.817674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.817534 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerName="util" Apr 20 19:33:26.817674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.817539 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerName="util" Apr 20 19:33:26.817674 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.817595 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9bd170d-e73d-449a-bf32-d16b74c782ee" containerName="extract" Apr 20 19:33:26.820699 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.820675 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:26.823425 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.823397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:33:26.823585 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.823445 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:33:26.824355 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.824334 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-s2h9h\"" Apr 20 19:33:26.831664 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.831607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:26.831949 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.831929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:26.832098 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.832083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcfsq\" (UniqueName: \"kubernetes.io/projected/c1273a89-da0f-4ea8-b5bf-d04231bca953-kube-api-access-gcfsq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:26.832291 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.832271 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg"] Apr 20 19:33:26.932823 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.932780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:26.932823 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.932830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcfsq\" (UniqueName: \"kubernetes.io/projected/c1273a89-da0f-4ea8-b5bf-d04231bca953-kube-api-access-gcfsq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:26.933185 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.932910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:26.933275 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.933252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:26.933336 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.933254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:26.941353 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:26.941326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcfsq\" (UniqueName: \"kubernetes.io/projected/c1273a89-da0f-4ea8-b5bf-d04231bca953-kube-api-access-gcfsq\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:27.133411 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:33:27.133379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:27.268337 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.268305 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg"] Apr 20 19:33:27.270389 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:33:27.270345 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1273a89_da0f_4ea8_b5bf_d04231bca953.slice/crio-5020c790a253b07144022c699c10d82939cb6be5f605b72350ae7750f51c8ad5 WatchSource:0}: Error finding container 5020c790a253b07144022c699c10d82939cb6be5f605b72350ae7750f51c8ad5: Status 404 returned error can't find the container with id 5020c790a253b07144022c699c10d82939cb6be5f605b72350ae7750f51c8ad5 Apr 20 19:33:27.403867 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.403772 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9"] Apr 20 19:33:27.407070 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.407045 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.414221 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.414187 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9"] Apr 20 19:33:27.438337 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.438282 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdbrx\" (UniqueName: \"kubernetes.io/projected/436fc7c2-2792-4cd5-93aa-53b635a2ab18-kube-api-access-vdbrx\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.438514 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.438355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.438514 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.438397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.502066 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.502027 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="c1273a89-da0f-4ea8-b5bf-d04231bca953" containerID="5c8e1276cb57412fdb5889add9a48f0a4864bb44d9581104270c7770afe3034c" exitCode=0 Apr 20 19:33:27.502272 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.502117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" event={"ID":"c1273a89-da0f-4ea8-b5bf-d04231bca953","Type":"ContainerDied","Data":"5c8e1276cb57412fdb5889add9a48f0a4864bb44d9581104270c7770afe3034c"} Apr 20 19:33:27.502272 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.502158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" event={"ID":"c1273a89-da0f-4ea8-b5bf-d04231bca953","Type":"ContainerStarted","Data":"5020c790a253b07144022c699c10d82939cb6be5f605b72350ae7750f51c8ad5"} Apr 20 19:33:27.539469 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.539425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.539717 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.539499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.539717 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.539554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdbrx\" 
(UniqueName: \"kubernetes.io/projected/436fc7c2-2792-4cd5-93aa-53b635a2ab18-kube-api-access-vdbrx\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.539951 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.539923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.540004 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.539986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.552305 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.552261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdbrx\" (UniqueName: \"kubernetes.io/projected/436fc7c2-2792-4cd5-93aa-53b635a2ab18-kube-api-access-vdbrx\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.732306 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.732210 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:27.863205 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:27.863155 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9"] Apr 20 19:33:27.871091 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:33:27.869784 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod436fc7c2_2792_4cd5_93aa_53b635a2ab18.slice/crio-b39637c6bfe12b04368b197128791c654dfe4bb7613add9c8b671401d7bd6473 WatchSource:0}: Error finding container b39637c6bfe12b04368b197128791c654dfe4bb7613add9c8b671401d7bd6473: Status 404 returned error can't find the container with id b39637c6bfe12b04368b197128791c654dfe4bb7613add9c8b671401d7bd6473 Apr 20 19:33:28.009413 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.009322 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj"] Apr 20 19:33:28.013096 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.013070 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.018981 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.018950 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj"] Apr 20 19:33:28.043463 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.043425 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.043599 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.043483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.043599 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.043550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qw29\" (UniqueName: \"kubernetes.io/projected/77968d31-953a-4f8b-942d-87fcb75f5352-kube-api-access-2qw29\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.144919 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.144874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2qw29\" (UniqueName: \"kubernetes.io/projected/77968d31-953a-4f8b-942d-87fcb75f5352-kube-api-access-2qw29\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.145130 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.144940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.145130 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.144971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.145348 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.145327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.145407 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.145335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-bundle\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.153890 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.153851 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qw29\" (UniqueName: \"kubernetes.io/projected/77968d31-953a-4f8b-942d-87fcb75f5352-kube-api-access-2qw29\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.344021 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.343983 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:28.404797 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.402451 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf"] Apr 20 19:33:28.408112 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.408081 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.414095 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.414062 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf"] Apr 20 19:33:28.448465 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.448428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf75t\" (UniqueName: \"kubernetes.io/projected/48fab0b8-61f2-4a90-9db9-fca6dedd5126-kube-api-access-nf75t\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.448681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.448482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.448681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.448557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.490990 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.490955 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj"] Apr 20 19:33:28.493599 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:33:28.493564 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77968d31_953a_4f8b_942d_87fcb75f5352.slice/crio-e3f2cea008f9cbcb4334b39c7bb9ec3eebf502738bbd6d2504769d33e601deab WatchSource:0}: Error finding container e3f2cea008f9cbcb4334b39c7bb9ec3eebf502738bbd6d2504769d33e601deab: Status 404 returned error can't find the container with id e3f2cea008f9cbcb4334b39c7bb9ec3eebf502738bbd6d2504769d33e601deab Apr 20 19:33:28.511504 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.511470 2575 generic.go:358] "Generic (PLEG): container finished" podID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerID="a3857ec6cdfb232b99d8e34cbd04396ccc3334ab7403b32bf81256906191e39e" exitCode=0 Apr 20 19:33:28.511764 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.511505 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" event={"ID":"436fc7c2-2792-4cd5-93aa-53b635a2ab18","Type":"ContainerDied","Data":"a3857ec6cdfb232b99d8e34cbd04396ccc3334ab7403b32bf81256906191e39e"} Apr 20 19:33:28.511764 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.511545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" event={"ID":"436fc7c2-2792-4cd5-93aa-53b635a2ab18","Type":"ContainerStarted","Data":"b39637c6bfe12b04368b197128791c654dfe4bb7613add9c8b671401d7bd6473"} Apr 20 19:33:28.513366 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.513339 2575 generic.go:358] "Generic (PLEG): container finished" podID="c1273a89-da0f-4ea8-b5bf-d04231bca953" containerID="1a1745738d84f759e2cbada11dbd2c5da6931edcc628d147e12b2733373852d3" exitCode=0 Apr 20 19:33:28.513503 ip-10-0-135-55 kubenswrapper[2575]: 
I0420 19:33:28.513418 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" event={"ID":"c1273a89-da0f-4ea8-b5bf-d04231bca953","Type":"ContainerDied","Data":"1a1745738d84f759e2cbada11dbd2c5da6931edcc628d147e12b2733373852d3"} Apr 20 19:33:28.514861 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.514832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" event={"ID":"77968d31-953a-4f8b-942d-87fcb75f5352","Type":"ContainerStarted","Data":"e3f2cea008f9cbcb4334b39c7bb9ec3eebf502738bbd6d2504769d33e601deab"} Apr 20 19:33:28.549217 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.549167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf75t\" (UniqueName: \"kubernetes.io/projected/48fab0b8-61f2-4a90-9db9-fca6dedd5126-kube-api-access-nf75t\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.549217 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.549218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.549452 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.549279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf\" (UID: 
\"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.549654 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.549604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.549774 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.549677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.557499 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.557465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf75t\" (UniqueName: \"kubernetes.io/projected/48fab0b8-61f2-4a90-9db9-fca6dedd5126-kube-api-access-nf75t\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.745101 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.745054 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:28.881433 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:28.881402 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf"] Apr 20 19:33:28.883254 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:33:28.883224 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48fab0b8_61f2_4a90_9db9_fca6dedd5126.slice/crio-9efe4590d51721b1e40a6e4231124a19ebf3bfaff236f0aa49bfa8cd11804bcc WatchSource:0}: Error finding container 9efe4590d51721b1e40a6e4231124a19ebf3bfaff236f0aa49bfa8cd11804bcc: Status 404 returned error can't find the container with id 9efe4590d51721b1e40a6e4231124a19ebf3bfaff236f0aa49bfa8cd11804bcc Apr 20 19:33:29.519971 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:29.519934 2575 generic.go:358] "Generic (PLEG): container finished" podID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerID="6e1d4336b94cf8b1efeaf0d2cb2c945189f602302178c5fed216ff42627d3bd9" exitCode=0 Apr 20 19:33:29.520116 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:29.520021 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" event={"ID":"436fc7c2-2792-4cd5-93aa-53b635a2ab18","Type":"ContainerDied","Data":"6e1d4336b94cf8b1efeaf0d2cb2c945189f602302178c5fed216ff42627d3bd9"} Apr 20 19:33:29.522042 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:29.522016 2575 generic.go:358] "Generic (PLEG): container finished" podID="c1273a89-da0f-4ea8-b5bf-d04231bca953" containerID="a44265702d0a144334f3c178cf7a5f8f8149396a991a6647f463eb6318f85299" exitCode=0 Apr 20 19:33:29.522135 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:29.522107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" event={"ID":"c1273a89-da0f-4ea8-b5bf-d04231bca953","Type":"ContainerDied","Data":"a44265702d0a144334f3c178cf7a5f8f8149396a991a6647f463eb6318f85299"} Apr 20 19:33:29.523494 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:29.523467 2575 generic.go:358] "Generic (PLEG): container finished" podID="77968d31-953a-4f8b-942d-87fcb75f5352" containerID="583149c153acb87906bcae121c16cd1f9f738c32a73dc8c4f6effa3a041406b0" exitCode=0 Apr 20 19:33:29.523633 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:29.523516 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" event={"ID":"77968d31-953a-4f8b-942d-87fcb75f5352","Type":"ContainerDied","Data":"583149c153acb87906bcae121c16cd1f9f738c32a73dc8c4f6effa3a041406b0"} Apr 20 19:33:29.525301 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:29.525156 2575 generic.go:358] "Generic (PLEG): container finished" podID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerID="a3c5d9941b2260575f3ea20b4c44c82cc9d6fe303b740a1427316cb6eb04c213" exitCode=0 Apr 20 19:33:29.525301 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:29.525228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" event={"ID":"48fab0b8-61f2-4a90-9db9-fca6dedd5126","Type":"ContainerDied","Data":"a3c5d9941b2260575f3ea20b4c44c82cc9d6fe303b740a1427316cb6eb04c213"} Apr 20 19:33:29.525301 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:29.525258 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" event={"ID":"48fab0b8-61f2-4a90-9db9-fca6dedd5126","Type":"ContainerStarted","Data":"9efe4590d51721b1e40a6e4231124a19ebf3bfaff236f0aa49bfa8cd11804bcc"} Apr 20 19:33:30.531448 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.531407 2575 
generic.go:358] "Generic (PLEG): container finished" podID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerID="67f1d4ddc775b77f35563b9fd820b6438822b775862bdf4aca1897af7356d5fe" exitCode=0 Apr 20 19:33:30.531896 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.531505 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" event={"ID":"436fc7c2-2792-4cd5-93aa-53b635a2ab18","Type":"ContainerDied","Data":"67f1d4ddc775b77f35563b9fd820b6438822b775862bdf4aca1897af7356d5fe"} Apr 20 19:33:30.533270 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.533248 2575 generic.go:358] "Generic (PLEG): container finished" podID="77968d31-953a-4f8b-942d-87fcb75f5352" containerID="062c3033ae3bbf8c63511ad3381094a6fe19d6aae65d72bb415fcd53fa5e28a9" exitCode=0 Apr 20 19:33:30.533394 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.533332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" event={"ID":"77968d31-953a-4f8b-942d-87fcb75f5352","Type":"ContainerDied","Data":"062c3033ae3bbf8c63511ad3381094a6fe19d6aae65d72bb415fcd53fa5e28a9"} Apr 20 19:33:30.535296 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.535188 2575 generic.go:358] "Generic (PLEG): container finished" podID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerID="58f7f9e81374ac95260a1d6b9777f2f6274916e033c6b241972d4a995e998e58" exitCode=0 Apr 20 19:33:30.535702 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.535680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" event={"ID":"48fab0b8-61f2-4a90-9db9-fca6dedd5126","Type":"ContainerDied","Data":"58f7f9e81374ac95260a1d6b9777f2f6274916e033c6b241972d4a995e998e58"} Apr 20 19:33:30.676272 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.675982 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:30.771530 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.771493 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-util\") pod \"c1273a89-da0f-4ea8-b5bf-d04231bca953\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " Apr 20 19:33:30.771752 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.771539 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-bundle\") pod \"c1273a89-da0f-4ea8-b5bf-d04231bca953\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " Apr 20 19:33:30.771752 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.771560 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcfsq\" (UniqueName: \"kubernetes.io/projected/c1273a89-da0f-4ea8-b5bf-d04231bca953-kube-api-access-gcfsq\") pod \"c1273a89-da0f-4ea8-b5bf-d04231bca953\" (UID: \"c1273a89-da0f-4ea8-b5bf-d04231bca953\") " Apr 20 19:33:30.772138 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.772102 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-bundle" (OuterVolumeSpecName: "bundle") pod "c1273a89-da0f-4ea8-b5bf-d04231bca953" (UID: "c1273a89-da0f-4ea8-b5bf-d04231bca953"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:33:30.774108 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.774073 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1273a89-da0f-4ea8-b5bf-d04231bca953-kube-api-access-gcfsq" (OuterVolumeSpecName: "kube-api-access-gcfsq") pod "c1273a89-da0f-4ea8-b5bf-d04231bca953" (UID: "c1273a89-da0f-4ea8-b5bf-d04231bca953"). InnerVolumeSpecName "kube-api-access-gcfsq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:33:30.777443 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.777413 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-util" (OuterVolumeSpecName: "util") pod "c1273a89-da0f-4ea8-b5bf-d04231bca953" (UID: "c1273a89-da0f-4ea8-b5bf-d04231bca953"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:33:30.872539 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.872436 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:30.872539 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.872474 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1273a89-da0f-4ea8-b5bf-d04231bca953-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:30.872539 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:30.872485 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gcfsq\" (UniqueName: \"kubernetes.io/projected/c1273a89-da0f-4ea8-b5bf-d04231bca953-kube-api-access-gcfsq\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:31.541987 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.541945 2575 generic.go:358] "Generic (PLEG): 
container finished" podID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerID="a115aa3ffc24ac8ba459b56ed76ed5fac837dc6e4a5f68484e888c6ac7f010e0" exitCode=0 Apr 20 19:33:31.542416 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.542067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" event={"ID":"48fab0b8-61f2-4a90-9db9-fca6dedd5126","Type":"ContainerDied","Data":"a115aa3ffc24ac8ba459b56ed76ed5fac837dc6e4a5f68484e888c6ac7f010e0"} Apr 20 19:33:31.543823 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.543787 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" event={"ID":"c1273a89-da0f-4ea8-b5bf-d04231bca953","Type":"ContainerDied","Data":"5020c790a253b07144022c699c10d82939cb6be5f605b72350ae7750f51c8ad5"} Apr 20 19:33:31.543823 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.543813 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg" Apr 20 19:33:31.543823 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.543824 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5020c790a253b07144022c699c10d82939cb6be5f605b72350ae7750f51c8ad5" Apr 20 19:33:31.546367 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.546333 2575 generic.go:358] "Generic (PLEG): container finished" podID="77968d31-953a-4f8b-942d-87fcb75f5352" containerID="a61af395e21824c56fe6297b1d2ff9c1ec62c07666060476bb804a53c2d84a6a" exitCode=0 Apr 20 19:33:31.546511 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.546375 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" event={"ID":"77968d31-953a-4f8b-942d-87fcb75f5352","Type":"ContainerDied","Data":"a61af395e21824c56fe6297b1d2ff9c1ec62c07666060476bb804a53c2d84a6a"} Apr 20 19:33:31.676654 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.676599 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:31.780084 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.780043 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-bundle\") pod \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " Apr 20 19:33:31.780084 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.780089 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-util\") pod \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " Apr 20 19:33:31.780342 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.780154 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdbrx\" (UniqueName: \"kubernetes.io/projected/436fc7c2-2792-4cd5-93aa-53b635a2ab18-kube-api-access-vdbrx\") pod \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\" (UID: \"436fc7c2-2792-4cd5-93aa-53b635a2ab18\") " Apr 20 19:33:31.780699 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.780672 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-bundle" (OuterVolumeSpecName: "bundle") pod "436fc7c2-2792-4cd5-93aa-53b635a2ab18" (UID: "436fc7c2-2792-4cd5-93aa-53b635a2ab18"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:33:31.782415 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.782390 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/436fc7c2-2792-4cd5-93aa-53b635a2ab18-kube-api-access-vdbrx" (OuterVolumeSpecName: "kube-api-access-vdbrx") pod "436fc7c2-2792-4cd5-93aa-53b635a2ab18" (UID: "436fc7c2-2792-4cd5-93aa-53b635a2ab18"). InnerVolumeSpecName "kube-api-access-vdbrx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:33:31.785111 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.785083 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-util" (OuterVolumeSpecName: "util") pod "436fc7c2-2792-4cd5-93aa-53b635a2ab18" (UID: "436fc7c2-2792-4cd5-93aa-53b635a2ab18"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:33:31.880850 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.880758 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:31.880850 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.880790 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/436fc7c2-2792-4cd5-93aa-53b635a2ab18-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:31.880850 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:31.880800 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vdbrx\" (UniqueName: \"kubernetes.io/projected/436fc7c2-2792-4cd5-93aa-53b635a2ab18-kube-api-access-vdbrx\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:32.552697 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.552645 2575 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" Apr 20 19:33:32.552697 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.552666 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9" event={"ID":"436fc7c2-2792-4cd5-93aa-53b635a2ab18","Type":"ContainerDied","Data":"b39637c6bfe12b04368b197128791c654dfe4bb7613add9c8b671401d7bd6473"} Apr 20 19:33:32.552697 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.552697 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b39637c6bfe12b04368b197128791c654dfe4bb7613add9c8b671401d7bd6473" Apr 20 19:33:32.688603 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.688566 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:32.722901 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.722873 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:32.789837 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.789794 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-bundle\") pod \"77968d31-953a-4f8b-942d-87fcb75f5352\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " Apr 20 19:33:32.790023 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.789850 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-util\") pod \"77968d31-953a-4f8b-942d-87fcb75f5352\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " Apr 20 19:33:32.790023 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.789867 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-bundle\") pod \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " Apr 20 19:33:32.790023 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.789886 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-util\") pod \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " Apr 20 19:33:32.790023 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.789915 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf75t\" (UniqueName: \"kubernetes.io/projected/48fab0b8-61f2-4a90-9db9-fca6dedd5126-kube-api-access-nf75t\") pod \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\" (UID: \"48fab0b8-61f2-4a90-9db9-fca6dedd5126\") " Apr 20 19:33:32.790023 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.789989 2575 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qw29\" (UniqueName: \"kubernetes.io/projected/77968d31-953a-4f8b-942d-87fcb75f5352-kube-api-access-2qw29\") pod \"77968d31-953a-4f8b-942d-87fcb75f5352\" (UID: \"77968d31-953a-4f8b-942d-87fcb75f5352\") " Apr 20 19:33:32.790424 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.790391 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-bundle" (OuterVolumeSpecName: "bundle") pod "77968d31-953a-4f8b-942d-87fcb75f5352" (UID: "77968d31-953a-4f8b-942d-87fcb75f5352"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:33:32.790690 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.790665 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-bundle" (OuterVolumeSpecName: "bundle") pod "48fab0b8-61f2-4a90-9db9-fca6dedd5126" (UID: "48fab0b8-61f2-4a90-9db9-fca6dedd5126"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:33:32.792353 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.792324 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77968d31-953a-4f8b-942d-87fcb75f5352-kube-api-access-2qw29" (OuterVolumeSpecName: "kube-api-access-2qw29") pod "77968d31-953a-4f8b-942d-87fcb75f5352" (UID: "77968d31-953a-4f8b-942d-87fcb75f5352"). InnerVolumeSpecName "kube-api-access-2qw29". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:33:32.792682 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.792652 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fab0b8-61f2-4a90-9db9-fca6dedd5126-kube-api-access-nf75t" (OuterVolumeSpecName: "kube-api-access-nf75t") pod "48fab0b8-61f2-4a90-9db9-fca6dedd5126" (UID: "48fab0b8-61f2-4a90-9db9-fca6dedd5126"). InnerVolumeSpecName "kube-api-access-nf75t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:33:32.796006 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.795950 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-util" (OuterVolumeSpecName: "util") pod "48fab0b8-61f2-4a90-9db9-fca6dedd5126" (UID: "48fab0b8-61f2-4a90-9db9-fca6dedd5126"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:33:32.796238 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.796203 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-util" (OuterVolumeSpecName: "util") pod "77968d31-953a-4f8b-942d-87fcb75f5352" (UID: "77968d31-953a-4f8b-942d-87fcb75f5352"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:33:32.891363 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.891333 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2qw29\" (UniqueName: \"kubernetes.io/projected/77968d31-953a-4f8b-942d-87fcb75f5352-kube-api-access-2qw29\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:32.891363 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.891366 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:32.891532 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.891377 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77968d31-953a-4f8b-942d-87fcb75f5352-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:32.891532 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.891386 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:32.891532 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.891393 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48fab0b8-61f2-4a90-9db9-fca6dedd5126-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:32.891532 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:32.891402 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nf75t\" (UniqueName: \"kubernetes.io/projected/48fab0b8-61f2-4a90-9db9-fca6dedd5126-kube-api-access-nf75t\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:33:33.558922 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:33.558880 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" event={"ID":"48fab0b8-61f2-4a90-9db9-fca6dedd5126","Type":"ContainerDied","Data":"9efe4590d51721b1e40a6e4231124a19ebf3bfaff236f0aa49bfa8cd11804bcc"} Apr 20 19:33:33.558922 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:33.558925 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9efe4590d51721b1e40a6e4231124a19ebf3bfaff236f0aa49bfa8cd11804bcc" Apr 20 19:33:33.559390 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:33.559003 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf" Apr 20 19:33:33.560741 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:33.560701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" event={"ID":"77968d31-953a-4f8b-942d-87fcb75f5352","Type":"ContainerDied","Data":"e3f2cea008f9cbcb4334b39c7bb9ec3eebf502738bbd6d2504769d33e601deab"} Apr 20 19:33:33.560741 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:33.560736 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3f2cea008f9cbcb4334b39c7bb9ec3eebf502738bbd6d2504769d33e601deab" Apr 20 19:33:33.560915 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:33.560773 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj" Apr 20 19:33:56.468025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:33:56.467989 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6755758b4b-hzksz"] Apr 20 19:34:03.144995 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.144953 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq"] Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145299 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77968d31-953a-4f8b-942d-87fcb75f5352" containerName="pull" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145310 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="77968d31-953a-4f8b-942d-87fcb75f5352" containerName="pull" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145320 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145326 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145337 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145343 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145351 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1273a89-da0f-4ea8-b5bf-d04231bca953" 
containerName="util" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145356 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1273a89-da0f-4ea8-b5bf-d04231bca953" containerName="util" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145362 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77968d31-953a-4f8b-942d-87fcb75f5352" containerName="util" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145367 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="77968d31-953a-4f8b-942d-87fcb75f5352" containerName="util" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145374 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerName="pull" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145378 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerName="pull" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145386 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1273a89-da0f-4ea8-b5bf-d04231bca953" containerName="pull" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145391 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1273a89-da0f-4ea8-b5bf-d04231bca953" containerName="pull" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145401 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1273a89-da0f-4ea8-b5bf-d04231bca953" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145406 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1273a89-da0f-4ea8-b5bf-d04231bca953" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:34:03.145416 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerName="util" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145421 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerName="util" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145426 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77968d31-953a-4f8b-942d-87fcb75f5352" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145432 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="77968d31-953a-4f8b-942d-87fcb75f5352" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145439 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerName="util" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145443 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerName="util" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145451 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerName="pull" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145456 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerName="pull" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145503 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="436fc7c2-2792-4cd5-93aa-53b635a2ab18" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145513 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="77968d31-953a-4f8b-942d-87fcb75f5352" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145520 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="48fab0b8-61f2-4a90-9db9-fca6dedd5126" containerName="extract" Apr 20 19:34:03.145519 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.145526 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1273a89-da0f-4ea8-b5bf-d04231bca953" containerName="extract" Apr 20 19:34:03.148743 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.148719 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.151694 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.151667 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:34:03.151963 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.151703 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 19:34:03.151963 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.151761 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 19:34:03.152629 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.152595 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-s2h9h\"" Apr 20 19:34:03.152697 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.152595 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:34:03.158779 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.158747 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq"] Apr 20 19:34:03.255984 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:34:03.255927 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdr2c\" (UniqueName: \"kubernetes.io/projected/ca8d07b0-2a3f-4077-8471-8a969c792dcf-kube-api-access-bdr2c\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.256179 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.256091 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ca8d07b0-2a3f-4077-8471-8a969c792dcf-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.256179 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.256129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8d07b0-2a3f-4077-8471-8a969c792dcf-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.357545 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.357492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ca8d07b0-2a3f-4077-8471-8a969c792dcf-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.357545 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.357551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ca8d07b0-2a3f-4077-8471-8a969c792dcf-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.357730 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.357579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdr2c\" (UniqueName: \"kubernetes.io/projected/ca8d07b0-2a3f-4077-8471-8a969c792dcf-kube-api-access-bdr2c\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.357730 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:34:03.357712 2575 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 20 19:34:03.357793 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:34:03.357785 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca8d07b0-2a3f-4077-8471-8a969c792dcf-plugin-serving-cert podName:ca8d07b0-2a3f-4077-8471-8a969c792dcf nodeName:}" failed. No retries permitted until 2026-04-20 19:34:03.857767803 +0000 UTC m=+499.548181055 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/ca8d07b0-2a3f-4077-8471-8a969c792dcf-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-r85fq" (UID: "ca8d07b0-2a3f-4077-8471-8a969c792dcf") : secret "plugin-serving-cert" not found Apr 20 19:34:03.358144 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.358124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ca8d07b0-2a3f-4077-8471-8a969c792dcf-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.368841 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.368805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdr2c\" (UniqueName: \"kubernetes.io/projected/ca8d07b0-2a3f-4077-8471-8a969c792dcf-kube-api-access-bdr2c\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.862948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.862900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8d07b0-2a3f-4077-8471-8a969c792dcf-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:03.865567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:03.865538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8d07b0-2a3f-4077-8471-8a969c792dcf-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-r85fq\" (UID: \"ca8d07b0-2a3f-4077-8471-8a969c792dcf\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:04.059242 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:04.059197 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" Apr 20 19:34:04.198212 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:04.198184 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq"] Apr 20 19:34:04.200040 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:34:04.200009 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca8d07b0_2a3f_4077_8471_8a969c792dcf.slice/crio-0dbfd05768f6da70bde39656597efc10d5b8ce8b763bf176d6feed7eb0226e81 WatchSource:0}: Error finding container 0dbfd05768f6da70bde39656597efc10d5b8ce8b763bf176d6feed7eb0226e81: Status 404 returned error can't find the container with id 0dbfd05768f6da70bde39656597efc10d5b8ce8b763bf176d6feed7eb0226e81 Apr 20 19:34:04.696848 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:04.696807 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" event={"ID":"ca8d07b0-2a3f-4077-8471-8a969c792dcf","Type":"ContainerStarted","Data":"0dbfd05768f6da70bde39656597efc10d5b8ce8b763bf176d6feed7eb0226e81"} Apr 20 19:34:21.487799 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:21.487757 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6755758b4b-hzksz" podUID="a0408d02-505e-4f7d-a05c-1026c8a3090d" containerName="console" containerID="cri-o://527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352" gracePeriod=15 Apr 20 19:34:26.673531 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.673484 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-6755758b4b-hzksz_a0408d02-505e-4f7d-a05c-1026c8a3090d/console/0.log" Apr 20 19:34:26.673981 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.673572 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6755758b4b-hzksz" Apr 20 19:34:26.765948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.765056 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-config\") pod \"a0408d02-505e-4f7d-a05c-1026c8a3090d\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " Apr 20 19:34:26.765948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.765154 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54r88\" (UniqueName: \"kubernetes.io/projected/a0408d02-505e-4f7d-a05c-1026c8a3090d-kube-api-access-54r88\") pod \"a0408d02-505e-4f7d-a05c-1026c8a3090d\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " Apr 20 19:34:26.765948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.765199 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-oauth-serving-cert\") pod \"a0408d02-505e-4f7d-a05c-1026c8a3090d\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " Apr 20 19:34:26.765948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.765232 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-trusted-ca-bundle\") pod \"a0408d02-505e-4f7d-a05c-1026c8a3090d\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " Apr 20 19:34:26.765948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.765285 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-service-ca\") pod \"a0408d02-505e-4f7d-a05c-1026c8a3090d\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " Apr 20 19:34:26.765948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.765347 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-oauth-config\") pod \"a0408d02-505e-4f7d-a05c-1026c8a3090d\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " Apr 20 19:34:26.765948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.765395 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-serving-cert\") pod \"a0408d02-505e-4f7d-a05c-1026c8a3090d\" (UID: \"a0408d02-505e-4f7d-a05c-1026c8a3090d\") " Apr 20 19:34:26.766424 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.766199 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a0408d02-505e-4f7d-a05c-1026c8a3090d" (UID: "a0408d02-505e-4f7d-a05c-1026c8a3090d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:34:26.769158 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.769116 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0408d02-505e-4f7d-a05c-1026c8a3090d-kube-api-access-54r88" (OuterVolumeSpecName: "kube-api-access-54r88") pod "a0408d02-505e-4f7d-a05c-1026c8a3090d" (UID: "a0408d02-505e-4f7d-a05c-1026c8a3090d"). InnerVolumeSpecName "kube-api-access-54r88". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:34:26.769312 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.769158 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a0408d02-505e-4f7d-a05c-1026c8a3090d" (UID: "a0408d02-505e-4f7d-a05c-1026c8a3090d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:34:26.769312 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.769202 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a0408d02-505e-4f7d-a05c-1026c8a3090d" (UID: "a0408d02-505e-4f7d-a05c-1026c8a3090d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:34:26.769312 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.769288 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-service-ca" (OuterVolumeSpecName: "service-ca") pod "a0408d02-505e-4f7d-a05c-1026c8a3090d" (UID: "a0408d02-505e-4f7d-a05c-1026c8a3090d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:34:26.769561 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.769540 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-config" (OuterVolumeSpecName: "console-config") pod "a0408d02-505e-4f7d-a05c-1026c8a3090d" (UID: "a0408d02-505e-4f7d-a05c-1026c8a3090d"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:34:26.773713 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.773672 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a0408d02-505e-4f7d-a05c-1026c8a3090d" (UID: "a0408d02-505e-4f7d-a05c-1026c8a3090d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:34:26.794029 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.793923 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" event={"ID":"ca8d07b0-2a3f-4077-8471-8a969c792dcf","Type":"ContainerStarted","Data":"c2bbcf76edf97ea650958a8e9934be5c6346cf34ff4685230eff1843c43562b9"} Apr 20 19:34:26.795257 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.795234 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6755758b4b-hzksz_a0408d02-505e-4f7d-a05c-1026c8a3090d/console/0.log" Apr 20 19:34:26.795412 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.795276 2575 generic.go:358] "Generic (PLEG): container finished" podID="a0408d02-505e-4f7d-a05c-1026c8a3090d" containerID="527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352" exitCode=2 Apr 20 19:34:26.795412 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.795306 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6755758b4b-hzksz" event={"ID":"a0408d02-505e-4f7d-a05c-1026c8a3090d","Type":"ContainerDied","Data":"527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352"} Apr 20 19:34:26.795412 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.795341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6755758b4b-hzksz" 
event={"ID":"a0408d02-505e-4f7d-a05c-1026c8a3090d","Type":"ContainerDied","Data":"0ac885035d3ae43bddf43c658bd222dd57eaabd68fabb0f720e4281d7a7df130"} Apr 20 19:34:26.795412 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.795355 2575 scope.go:117] "RemoveContainer" containerID="527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352" Apr 20 19:34:26.795412 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.795372 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6755758b4b-hzksz" Apr 20 19:34:26.804737 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.804711 2575 scope.go:117] "RemoveContainer" containerID="527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352" Apr 20 19:34:26.805028 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:34:26.805006 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352\": container with ID starting with 527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352 not found: ID does not exist" containerID="527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352" Apr 20 19:34:26.805104 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.805041 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352"} err="failed to get container status \"527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352\": rpc error: code = NotFound desc = could not find container \"527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352\": container with ID starting with 527941546c3275aaa3b972085af5244a3eb3a4de1021f3cbe4f0920e55d88352 not found: ID does not exist" Apr 20 19:34:26.812002 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.811948 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r85fq" podStartSLOduration=1.424895313 podStartE2EDuration="23.811929042s" podCreationTimestamp="2026-04-20 19:34:03 +0000 UTC" firstStartedPulling="2026-04-20 19:34:04.201446452 +0000 UTC m=+499.891859718" lastFinishedPulling="2026-04-20 19:34:26.58848019 +0000 UTC m=+522.278893447" observedRunningTime="2026-04-20 19:34:26.811605653 +0000 UTC m=+522.502018926" watchObservedRunningTime="2026-04-20 19:34:26.811929042 +0000 UTC m=+522.502342330" Apr 20 19:34:26.833134 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.833091 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6755758b4b-hzksz"] Apr 20 19:34:26.840453 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.840421 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6755758b4b-hzksz"] Apr 20 19:34:26.866632 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.866584 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-oauth-serving-cert\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:34:26.866632 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.866640 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-trusted-ca-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:34:26.866885 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.866653 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-service-ca\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:34:26.866885 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.866662 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-oauth-config\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:34:26.866885 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.866671 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-serving-cert\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:34:26.866885 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.866680 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a0408d02-505e-4f7d-a05c-1026c8a3090d-console-config\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:34:26.866885 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.866689 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-54r88\" (UniqueName: \"kubernetes.io/projected/a0408d02-505e-4f7d-a05c-1026c8a3090d-kube-api-access-54r88\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:34:26.908071 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:26.908025 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0408d02-505e-4f7d-a05c-1026c8a3090d" path="/var/lib/kubelet/pods/a0408d02-505e-4f7d-a05c-1026c8a3090d/volumes" Apr 20 19:34:56.162509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.162454 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:34:56.164457 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.164423 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0408d02-505e-4f7d-a05c-1026c8a3090d" containerName="console" Apr 20 19:34:56.164457 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.164460 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0408d02-505e-4f7d-a05c-1026c8a3090d" containerName="console" Apr 20 
19:34:56.164690 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.164548 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0408d02-505e-4f7d-a05c-1026c8a3090d" containerName="console" Apr 20 19:34:56.267238 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.267203 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:34:56.267238 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.267237 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:34:56.267445 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.267352 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" Apr 20 19:34:56.269803 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.269779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 19:34:56.431020 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.430926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b636e4cb-4980-4c0a-84a8-ba712bb3d0b0-config-file\") pod \"limitador-limitador-78c99df468-cx6q2\" (UID: \"b636e4cb-4980-4c0a-84a8-ba712bb3d0b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" Apr 20 19:34:56.431020 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.430970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jxns\" (UniqueName: \"kubernetes.io/projected/b636e4cb-4980-4c0a-84a8-ba712bb3d0b0-kube-api-access-2jxns\") pod \"limitador-limitador-78c99df468-cx6q2\" (UID: \"b636e4cb-4980-4c0a-84a8-ba712bb3d0b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" Apr 20 19:34:56.532056 ip-10-0-135-55 kubenswrapper[2575]: 
I0420 19:34:56.532019 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b636e4cb-4980-4c0a-84a8-ba712bb3d0b0-config-file\") pod \"limitador-limitador-78c99df468-cx6q2\" (UID: \"b636e4cb-4980-4c0a-84a8-ba712bb3d0b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" Apr 20 19:34:56.532056 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.532063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jxns\" (UniqueName: \"kubernetes.io/projected/b636e4cb-4980-4c0a-84a8-ba712bb3d0b0-kube-api-access-2jxns\") pod \"limitador-limitador-78c99df468-cx6q2\" (UID: \"b636e4cb-4980-4c0a-84a8-ba712bb3d0b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" Apr 20 19:34:56.532736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.532716 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b636e4cb-4980-4c0a-84a8-ba712bb3d0b0-config-file\") pod \"limitador-limitador-78c99df468-cx6q2\" (UID: \"b636e4cb-4980-4c0a-84a8-ba712bb3d0b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" Apr 20 19:34:56.540105 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.540075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jxns\" (UniqueName: \"kubernetes.io/projected/b636e4cb-4980-4c0a-84a8-ba712bb3d0b0-kube-api-access-2jxns\") pod \"limitador-limitador-78c99df468-cx6q2\" (UID: \"b636e4cb-4980-4c0a-84a8-ba712bb3d0b0\") " pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" Apr 20 19:34:56.578462 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.578417 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" Apr 20 19:34:56.653681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.653638 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8ds4t"] Apr 20 19:34:56.745017 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:34:56.744978 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb636e4cb_4980_4c0a_84a8_ba712bb3d0b0.slice/crio-e4c1ee1ccb0af80ec8f7a4b70c0bb85150b23b50a6f0d305a92ff6503b5e2671 WatchSource:0}: Error finding container e4c1ee1ccb0af80ec8f7a4b70c0bb85150b23b50a6f0d305a92ff6503b5e2671: Status 404 returned error can't find the container with id e4c1ee1ccb0af80ec8f7a4b70c0bb85150b23b50a6f0d305a92ff6503b5e2671 Apr 20 19:34:56.766087 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.766045 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8ds4t"] Apr 20 19:34:56.766087 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.766076 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:34:56.766291 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.766177 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" Apr 20 19:34:56.768869 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.768841 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-j57d9\"" Apr 20 19:34:56.836126 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.836050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67mnc\" (UniqueName: \"kubernetes.io/projected/279057fc-4767-4ddb-ae06-cd83890d360b-kube-api-access-67mnc\") pod \"authorino-f99f4b5cd-8ds4t\" (UID: \"279057fc-4767-4ddb-ae06-cd83890d360b\") " pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" Apr 20 19:34:56.915105 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.915070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" event={"ID":"b636e4cb-4980-4c0a-84a8-ba712bb3d0b0","Type":"ContainerStarted","Data":"e4c1ee1ccb0af80ec8f7a4b70c0bb85150b23b50a6f0d305a92ff6503b5e2671"} Apr 20 19:34:56.937772 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.937680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67mnc\" (UniqueName: \"kubernetes.io/projected/279057fc-4767-4ddb-ae06-cd83890d360b-kube-api-access-67mnc\") pod \"authorino-f99f4b5cd-8ds4t\" (UID: \"279057fc-4767-4ddb-ae06-cd83890d360b\") " pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" Apr 20 19:34:56.945964 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:56.945925 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67mnc\" (UniqueName: \"kubernetes.io/projected/279057fc-4767-4ddb-ae06-cd83890d360b-kube-api-access-67mnc\") pod \"authorino-f99f4b5cd-8ds4t\" (UID: \"279057fc-4767-4ddb-ae06-cd83890d360b\") " pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" Apr 20 19:34:57.076733 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:57.076689 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" Apr 20 19:34:57.217841 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:57.217740 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8ds4t"] Apr 20 19:34:57.221750 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:34:57.221708 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279057fc_4767_4ddb_ae06_cd83890d360b.slice/crio-7d6b135e476730d167a2bac2490ffa651f351c0c6ffe25ecb7cb5df8bb934d47 WatchSource:0}: Error finding container 7d6b135e476730d167a2bac2490ffa651f351c0c6ffe25ecb7cb5df8bb934d47: Status 404 returned error can't find the container with id 7d6b135e476730d167a2bac2490ffa651f351c0c6ffe25ecb7cb5df8bb934d47 Apr 20 19:34:57.922216 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:34:57.922178 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" event={"ID":"279057fc-4767-4ddb-ae06-cd83890d360b","Type":"ContainerStarted","Data":"7d6b135e476730d167a2bac2490ffa651f351c0c6ffe25ecb7cb5df8bb934d47"} Apr 20 19:35:00.675225 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:00.675190 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8ds4t"] Apr 20 19:35:01.941056 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:01.941018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" event={"ID":"279057fc-4767-4ddb-ae06-cd83890d360b","Type":"ContainerStarted","Data":"d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f"} Apr 20 19:35:01.941681 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:01.941054 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" podUID="279057fc-4767-4ddb-ae06-cd83890d360b" containerName="authorino" 
containerID="cri-o://d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f" gracePeriod=30 Apr 20 19:35:01.942530 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:01.942499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" event={"ID":"b636e4cb-4980-4c0a-84a8-ba712bb3d0b0","Type":"ContainerStarted","Data":"b1b6beccd4e885d156ae59b33df3ee259a1f647faca6db7157254e3f369caa15"} Apr 20 19:35:01.942667 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:01.942607 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" Apr 20 19:35:01.956906 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:01.956861 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" podStartSLOduration=1.878619984 podStartE2EDuration="5.956844028s" podCreationTimestamp="2026-04-20 19:34:56 +0000 UTC" firstStartedPulling="2026-04-20 19:34:57.223369006 +0000 UTC m=+552.913782258" lastFinishedPulling="2026-04-20 19:35:01.301593045 +0000 UTC m=+556.992006302" observedRunningTime="2026-04-20 19:35:01.954722706 +0000 UTC m=+557.645135993" watchObservedRunningTime="2026-04-20 19:35:01.956844028 +0000 UTC m=+557.647257302" Apr 20 19:35:01.971140 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:01.971083 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2" podStartSLOduration=1.417041193 podStartE2EDuration="5.971068041s" podCreationTimestamp="2026-04-20 19:34:56 +0000 UTC" firstStartedPulling="2026-04-20 19:34:56.747090308 +0000 UTC m=+552.437503564" lastFinishedPulling="2026-04-20 19:35:01.30111716 +0000 UTC m=+556.991530412" observedRunningTime="2026-04-20 19:35:01.96949794 +0000 UTC m=+557.659911214" watchObservedRunningTime="2026-04-20 19:35:01.971068041 +0000 UTC m=+557.661481314" Apr 20 19:35:02.198456 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:35:02.198378 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t"
Apr 20 19:35:02.286594 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.286554 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67mnc\" (UniqueName: \"kubernetes.io/projected/279057fc-4767-4ddb-ae06-cd83890d360b-kube-api-access-67mnc\") pod \"279057fc-4767-4ddb-ae06-cd83890d360b\" (UID: \"279057fc-4767-4ddb-ae06-cd83890d360b\") "
Apr 20 19:35:02.288840 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.288810 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279057fc-4767-4ddb-ae06-cd83890d360b-kube-api-access-67mnc" (OuterVolumeSpecName: "kube-api-access-67mnc") pod "279057fc-4767-4ddb-ae06-cd83890d360b" (UID: "279057fc-4767-4ddb-ae06-cd83890d360b"). InnerVolumeSpecName "kube-api-access-67mnc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:35:02.387973 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.387928 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-67mnc\" (UniqueName: \"kubernetes.io/projected/279057fc-4767-4ddb-ae06-cd83890d360b-kube-api-access-67mnc\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:35:02.947463 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.947432 2575 generic.go:358] "Generic (PLEG): container finished" podID="279057fc-4767-4ddb-ae06-cd83890d360b" containerID="d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f" exitCode=0
Apr 20 19:35:02.947926 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.947484 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t"
Apr 20 19:35:02.947926 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.947515 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" event={"ID":"279057fc-4767-4ddb-ae06-cd83890d360b","Type":"ContainerDied","Data":"d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f"}
Apr 20 19:35:02.947926 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.947556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8ds4t" event={"ID":"279057fc-4767-4ddb-ae06-cd83890d360b","Type":"ContainerDied","Data":"7d6b135e476730d167a2bac2490ffa651f351c0c6ffe25ecb7cb5df8bb934d47"}
Apr 20 19:35:02.947926 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.947576 2575 scope.go:117] "RemoveContainer" containerID="d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f"
Apr 20 19:35:02.956666 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.956644 2575 scope.go:117] "RemoveContainer" containerID="d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f"
Apr 20 19:35:02.957004 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:35:02.956981 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f\": container with ID starting with d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f not found: ID does not exist" containerID="d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f"
Apr 20 19:35:02.957055 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.957014 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f"} err="failed to get container status \"d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f\": rpc error: code = NotFound desc = could not find container \"d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f\": container with ID starting with d2b62941c625867606dbe8c6d15b893de62eeb4614ae89edc97638ace012051f not found: ID does not exist"
Apr 20 19:35:02.963470 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.963430 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8ds4t"]
Apr 20 19:35:02.970449 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:02.970414 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8ds4t"]
Apr 20 19:35:04.908428 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:04.908392 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279057fc-4767-4ddb-ae06-cd83890d360b" path="/var/lib/kubelet/pods/279057fc-4767-4ddb-ae06-cd83890d360b/volumes"
Apr 20 19:35:12.949084 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:12.949051 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-cx6q2"
Apr 20 19:35:31.167911 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.167866 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"]
Apr 20 19:35:31.168351 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.168197 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="279057fc-4767-4ddb-ae06-cd83890d360b" containerName="authorino"
Apr 20 19:35:31.168351 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.168207 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="279057fc-4767-4ddb-ae06-cd83890d360b" containerName="authorino"
Apr 20 19:35:31.168351 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.168271 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="279057fc-4767-4ddb-ae06-cd83890d360b" containerName="authorino"
Apr 20 19:35:31.171673 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.171652 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.174258 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.174232 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5pq88\""
Apr 20 19:35:31.174407 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.174287 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 19:35:31.174407 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.174287 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 19:35:31.178672 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.178642 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"]
Apr 20 19:35:31.210973 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.210933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mdqw\" (UniqueName: \"kubernetes.io/projected/e3d2d644-208c-4eab-8170-31fc1d44e565-kube-api-access-4mdqw\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.210973 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.210987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.211207 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.211009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.312450 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.312400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mdqw\" (UniqueName: \"kubernetes.io/projected/e3d2d644-208c-4eab-8170-31fc1d44e565-kube-api-access-4mdqw\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.312450 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.312448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.312755 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.312568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.312942 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.312919 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.312997 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.312950 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.321295 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.321262 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mdqw\" (UniqueName: \"kubernetes.io/projected/e3d2d644-208c-4eab-8170-31fc1d44e565-kube-api-access-4mdqw\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.482267 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.482139 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:31.619304 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:31.619274 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"]
Apr 20 19:35:31.621050 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:35:31.621011 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3d2d644_208c_4eab_8170_31fc1d44e565.slice/crio-84b408e169a81a9d53de95990bb1cb906c4c558267a57d68ea48ec0cf3c7e7bf WatchSource:0}: Error finding container 84b408e169a81a9d53de95990bb1cb906c4c558267a57d68ea48ec0cf3c7e7bf: Status 404 returned error can't find the container with id 84b408e169a81a9d53de95990bb1cb906c4c558267a57d68ea48ec0cf3c7e7bf
Apr 20 19:35:32.061679 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:32.061645 2575 generic.go:358] "Generic (PLEG): container finished" podID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerID="042f0b0ad9429795285eb2dea601a36331d73517f14bdc225afe6765e9542678" exitCode=0
Apr 20 19:35:32.061857 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:32.061738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j" event={"ID":"e3d2d644-208c-4eab-8170-31fc1d44e565","Type":"ContainerDied","Data":"042f0b0ad9429795285eb2dea601a36331d73517f14bdc225afe6765e9542678"}
Apr 20 19:35:32.061857 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:32.061773 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j" event={"ID":"e3d2d644-208c-4eab-8170-31fc1d44e565","Type":"ContainerStarted","Data":"84b408e169a81a9d53de95990bb1cb906c4c558267a57d68ea48ec0cf3c7e7bf"}
Apr 20 19:35:33.067180 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:33.067085 2575 generic.go:358] "Generic (PLEG): container finished" podID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerID="2eecb7411ac4224daf2e406235d019b540cd75f2b42b4b10d4794265bc4109ce" exitCode=0
Apr 20 19:35:33.067180 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:33.067154 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j" event={"ID":"e3d2d644-208c-4eab-8170-31fc1d44e565","Type":"ContainerDied","Data":"2eecb7411ac4224daf2e406235d019b540cd75f2b42b4b10d4794265bc4109ce"}
Apr 20 19:35:34.073353 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:34.073307 2575 generic.go:358] "Generic (PLEG): container finished" podID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerID="a5b9bf16f1da8feb38a9fb3cfdf01864f02d9b9fa27448b7ddbd522919c1ae57" exitCode=0
Apr 20 19:35:34.073753 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:34.073364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j" event={"ID":"e3d2d644-208c-4eab-8170-31fc1d44e565","Type":"ContainerDied","Data":"a5b9bf16f1da8feb38a9fb3cfdf01864f02d9b9fa27448b7ddbd522919c1ae57"}
Apr 20 19:35:35.210925 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.210889 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:35.248452 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.248407 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mdqw\" (UniqueName: \"kubernetes.io/projected/e3d2d644-208c-4eab-8170-31fc1d44e565-kube-api-access-4mdqw\") pod \"e3d2d644-208c-4eab-8170-31fc1d44e565\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") "
Apr 20 19:35:35.248672 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.248463 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-util\") pod \"e3d2d644-208c-4eab-8170-31fc1d44e565\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") "
Apr 20 19:35:35.248672 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.248596 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-bundle\") pod \"e3d2d644-208c-4eab-8170-31fc1d44e565\" (UID: \"e3d2d644-208c-4eab-8170-31fc1d44e565\") "
Apr 20 19:35:35.249064 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.249035 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-bundle" (OuterVolumeSpecName: "bundle") pod "e3d2d644-208c-4eab-8170-31fc1d44e565" (UID: "e3d2d644-208c-4eab-8170-31fc1d44e565"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:35:35.250836 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.250797 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3d2d644-208c-4eab-8170-31fc1d44e565-kube-api-access-4mdqw" (OuterVolumeSpecName: "kube-api-access-4mdqw") pod "e3d2d644-208c-4eab-8170-31fc1d44e565" (UID: "e3d2d644-208c-4eab-8170-31fc1d44e565"). InnerVolumeSpecName "kube-api-access-4mdqw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:35:35.254206 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.254169 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-util" (OuterVolumeSpecName: "util") pod "e3d2d644-208c-4eab-8170-31fc1d44e565" (UID: "e3d2d644-208c-4eab-8170-31fc1d44e565"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:35:35.349562 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.349457 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mdqw\" (UniqueName: \"kubernetes.io/projected/e3d2d644-208c-4eab-8170-31fc1d44e565-kube-api-access-4mdqw\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:35:35.349562 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.349495 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-util\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:35:35.349562 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:35.349512 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e3d2d644-208c-4eab-8170-31fc1d44e565-bundle\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:35:36.082736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:36.082696 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j" event={"ID":"e3d2d644-208c-4eab-8170-31fc1d44e565","Type":"ContainerDied","Data":"84b408e169a81a9d53de95990bb1cb906c4c558267a57d68ea48ec0cf3c7e7bf"}
Apr 20 19:35:36.082736 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:36.082743 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84b408e169a81a9d53de95990bb1cb906c4c558267a57d68ea48ec0cf3c7e7bf"
Apr 20 19:35:36.082998 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:36.082711 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350wkx4j"
Apr 20 19:35:44.815504 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:44.815477 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w558t_97e58d97-c4b1-4d4a-a6f6-d87a86138255/ovn-acl-logging/0.log"
Apr 20 19:35:44.816060 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:35:44.815882 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w558t_97e58d97-c4b1-4d4a-a6f6-d87a86138255/ovn-acl-logging/0.log"
Apr 20 19:36:24.831075 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.831036 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lq5v6"]
Apr 20 19:36:24.831509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.831366 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerName="pull"
Apr 20 19:36:24.831509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.831379 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerName="pull"
Apr 20 19:36:24.831509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.831393 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerName="util"
Apr 20 19:36:24.831509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.831398 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerName="util"
Apr 20 19:36:24.831509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.831405 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerName="extract"
Apr 20 19:36:24.831509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.831410 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerName="extract"
Apr 20 19:36:24.831509 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.831471 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3d2d644-208c-4eab-8170-31fc1d44e565" containerName="extract"
Apr 20 19:36:24.833702 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.833683 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-lq5v6"
Apr 20 19:36:24.836130 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.836104 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-j57d9\""
Apr 20 19:36:24.840516 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.840485 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lq5v6"]
Apr 20 19:36:24.882345 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.882297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd69t\" (UniqueName: \"kubernetes.io/projected/661a56e3-9af8-4ef6-9907-f4df0bb0bdc5-kube-api-access-fd69t\") pod \"authorino-8b475cf9f-lq5v6\" (UID: \"661a56e3-9af8-4ef6-9907-f4df0bb0bdc5\") " pod="kuadrant-system/authorino-8b475cf9f-lq5v6"
Apr 20 19:36:24.983342 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.983304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd69t\" (UniqueName: \"kubernetes.io/projected/661a56e3-9af8-4ef6-9907-f4df0bb0bdc5-kube-api-access-fd69t\") pod \"authorino-8b475cf9f-lq5v6\" (UID: \"661a56e3-9af8-4ef6-9907-f4df0bb0bdc5\") " pod="kuadrant-system/authorino-8b475cf9f-lq5v6"
Apr 20 19:36:24.991781 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:24.991749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd69t\" (UniqueName: \"kubernetes.io/projected/661a56e3-9af8-4ef6-9907-f4df0bb0bdc5-kube-api-access-fd69t\") pod \"authorino-8b475cf9f-lq5v6\" (UID: \"661a56e3-9af8-4ef6-9907-f4df0bb0bdc5\") " pod="kuadrant-system/authorino-8b475cf9f-lq5v6"
Apr 20 19:36:25.051541 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.051497 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lq5v6"]
Apr 20 19:36:25.051810 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.051797 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-lq5v6"
Apr 20 19:36:25.080461 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.080423 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-576cffc9f8-65n9h"]
Apr 20 19:36:25.083437 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.083358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-576cffc9f8-65n9h"
Apr 20 19:36:25.089866 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.089835 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-576cffc9f8-65n9h"]
Apr 20 19:36:25.184434 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.184399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dwsm\" (UniqueName: \"kubernetes.io/projected/6178ac8c-e6a1-4920-8e7a-0319af994c36-kube-api-access-7dwsm\") pod \"authorino-576cffc9f8-65n9h\" (UID: \"6178ac8c-e6a1-4920-8e7a-0319af994c36\") " pod="kuadrant-system/authorino-576cffc9f8-65n9h"
Apr 20 19:36:25.193216 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.193190 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lq5v6"]
Apr 20 19:36:25.195757 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:36:25.195728 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod661a56e3_9af8_4ef6_9907_f4df0bb0bdc5.slice/crio-ebe3ce4b8e45c35b6ce78ac76355bb08a2471901361bf2adc7b3a6fa6ee9cbb5 WatchSource:0}: Error finding container ebe3ce4b8e45c35b6ce78ac76355bb08a2471901361bf2adc7b3a6fa6ee9cbb5: Status 404 returned error can't find the container with id ebe3ce4b8e45c35b6ce78ac76355bb08a2471901361bf2adc7b3a6fa6ee9cbb5
Apr 20 19:36:25.197555 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.197536 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:36:25.276135 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.276089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-lq5v6" event={"ID":"661a56e3-9af8-4ef6-9907-f4df0bb0bdc5","Type":"ContainerStarted","Data":"ebe3ce4b8e45c35b6ce78ac76355bb08a2471901361bf2adc7b3a6fa6ee9cbb5"}
Apr 20 19:36:25.292028 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.291977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dwsm\" (UniqueName: \"kubernetes.io/projected/6178ac8c-e6a1-4920-8e7a-0319af994c36-kube-api-access-7dwsm\") pod \"authorino-576cffc9f8-65n9h\" (UID: \"6178ac8c-e6a1-4920-8e7a-0319af994c36\") " pod="kuadrant-system/authorino-576cffc9f8-65n9h"
Apr 20 19:36:25.301512 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.301468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dwsm\" (UniqueName: \"kubernetes.io/projected/6178ac8c-e6a1-4920-8e7a-0319af994c36-kube-api-access-7dwsm\") pod \"authorino-576cffc9f8-65n9h\" (UID: \"6178ac8c-e6a1-4920-8e7a-0319af994c36\") " pod="kuadrant-system/authorino-576cffc9f8-65n9h"
Apr 20 19:36:25.394802 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.394761 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-576cffc9f8-65n9h"
Apr 20 19:36:25.411928 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.411891 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-576cffc9f8-65n9h"]
Apr 20 19:36:25.439969 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.439933 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-78596c8ffd-kp5v8"]
Apr 20 19:36:25.443034 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.443005 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-78596c8ffd-kp5v8"
Apr 20 19:36:25.449388 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.445772 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 20 19:36:25.451051 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.451019 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-78596c8ffd-kp5v8"]
Apr 20 19:36:25.493798 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.493760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98xfc\" (UniqueName: \"kubernetes.io/projected/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-kube-api-access-98xfc\") pod \"authorino-78596c8ffd-kp5v8\" (UID: \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\") " pod="kuadrant-system/authorino-78596c8ffd-kp5v8"
Apr 20 19:36:25.493954 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.493905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-tls-cert\") pod \"authorino-78596c8ffd-kp5v8\" (UID: \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\") " pod="kuadrant-system/authorino-78596c8ffd-kp5v8"
Apr 20 19:36:25.533474 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.533434 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-576cffc9f8-65n9h"]
Apr 20 19:36:25.536397 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:36:25.536371 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6178ac8c_e6a1_4920_8e7a_0319af994c36.slice/crio-6faf4bf087107f1c1163865be33ebc8a2b96d8ef7b39fb8d57dabd1c507a634c WatchSource:0}: Error finding container 6faf4bf087107f1c1163865be33ebc8a2b96d8ef7b39fb8d57dabd1c507a634c: Status 404 returned error can't find the container with id 6faf4bf087107f1c1163865be33ebc8a2b96d8ef7b39fb8d57dabd1c507a634c
Apr 20 19:36:25.594647 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.594571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-tls-cert\") pod \"authorino-78596c8ffd-kp5v8\" (UID: \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\") " pod="kuadrant-system/authorino-78596c8ffd-kp5v8"
Apr 20 19:36:25.594813 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.594686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98xfc\" (UniqueName: \"kubernetes.io/projected/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-kube-api-access-98xfc\") pod \"authorino-78596c8ffd-kp5v8\" (UID: \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\") " pod="kuadrant-system/authorino-78596c8ffd-kp5v8"
Apr 20 19:36:25.597205 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.597178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-tls-cert\") pod \"authorino-78596c8ffd-kp5v8\" (UID: \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\") " pod="kuadrant-system/authorino-78596c8ffd-kp5v8"
Apr 20 19:36:25.604263 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.604232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98xfc\" (UniqueName: \"kubernetes.io/projected/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-kube-api-access-98xfc\") pod \"authorino-78596c8ffd-kp5v8\" (UID: \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\") " pod="kuadrant-system/authorino-78596c8ffd-kp5v8"
Apr 20 19:36:25.761748 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.761647 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-78596c8ffd-kp5v8"
Apr 20 19:36:25.896390 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:25.896359 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-78596c8ffd-kp5v8"]
Apr 20 19:36:25.898532 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:36:25.898502 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdfe2ca0_c39e_42c1_a16e_82d3960202e3.slice/crio-555e488b3dc8af058d53c12d6acf31ddfaf331e2865378e83c49ec773357b2ba WatchSource:0}: Error finding container 555e488b3dc8af058d53c12d6acf31ddfaf331e2865378e83c49ec773357b2ba: Status 404 returned error can't find the container with id 555e488b3dc8af058d53c12d6acf31ddfaf331e2865378e83c49ec773357b2ba
Apr 20 19:36:26.282210 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.281968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-576cffc9f8-65n9h" event={"ID":"6178ac8c-e6a1-4920-8e7a-0319af994c36","Type":"ContainerStarted","Data":"93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d"}
Apr 20 19:36:26.282210 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.282016 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-576cffc9f8-65n9h" event={"ID":"6178ac8c-e6a1-4920-8e7a-0319af994c36","Type":"ContainerStarted","Data":"6faf4bf087107f1c1163865be33ebc8a2b96d8ef7b39fb8d57dabd1c507a634c"}
Apr 20 19:36:26.282210 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.282154 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-576cffc9f8-65n9h" podUID="6178ac8c-e6a1-4920-8e7a-0319af994c36" containerName="authorino" containerID="cri-o://93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d" gracePeriod=30
Apr 20 19:36:26.283948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.283913 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-78596c8ffd-kp5v8" event={"ID":"fdfe2ca0-c39e-42c1-a16e-82d3960202e3","Type":"ContainerStarted","Data":"d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558"}
Apr 20 19:36:26.284097 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.283954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-78596c8ffd-kp5v8" event={"ID":"fdfe2ca0-c39e-42c1-a16e-82d3960202e3","Type":"ContainerStarted","Data":"555e488b3dc8af058d53c12d6acf31ddfaf331e2865378e83c49ec773357b2ba"}
Apr 20 19:36:26.286003 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.285968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-lq5v6" event={"ID":"661a56e3-9af8-4ef6-9907-f4df0bb0bdc5","Type":"ContainerStarted","Data":"84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415"}
Apr 20 19:36:26.286154 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.286017 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-lq5v6" podUID="661a56e3-9af8-4ef6-9907-f4df0bb0bdc5" containerName="authorino" containerID="cri-o://84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415" gracePeriod=30
Apr 20 19:36:26.297664 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.297531 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-576cffc9f8-65n9h" podStartSLOduration=0.927638461 podStartE2EDuration="1.297511134s" podCreationTimestamp="2026-04-20 19:36:25 +0000 UTC" firstStartedPulling="2026-04-20 19:36:25.537700702 +0000 UTC m=+641.228113954" lastFinishedPulling="2026-04-20 19:36:25.907573372 +0000 UTC m=+641.597986627" observedRunningTime="2026-04-20 19:36:26.296683082 +0000 UTC m=+641.987096356" watchObservedRunningTime="2026-04-20 19:36:26.297511134 +0000 UTC m=+641.987924402"
Apr 20 19:36:26.314018 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.313950 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-lq5v6" podStartSLOduration=1.946714108 podStartE2EDuration="2.31392835s" podCreationTimestamp="2026-04-20 19:36:24 +0000 UTC" firstStartedPulling="2026-04-20 19:36:25.197719195 +0000 UTC m=+640.888132447" lastFinishedPulling="2026-04-20 19:36:25.564933435 +0000 UTC m=+641.255346689" observedRunningTime="2026-04-20 19:36:26.311780908 +0000 UTC m=+642.002194183" watchObservedRunningTime="2026-04-20 19:36:26.31392835 +0000 UTC m=+642.004341626"
Apr 20 19:36:26.329906 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.329846 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-78596c8ffd-kp5v8" podStartSLOduration=1.035816911 podStartE2EDuration="1.329829289s" podCreationTimestamp="2026-04-20 19:36:25 +0000 UTC" firstStartedPulling="2026-04-20 19:36:25.89990892 +0000 UTC m=+641.590322175" lastFinishedPulling="2026-04-20 19:36:26.193921299 +0000 UTC m=+641.884334553" observedRunningTime="2026-04-20 19:36:26.326582227 +0000 UTC m=+642.016995501" watchObservedRunningTime="2026-04-20 19:36:26.329829289 +0000 UTC m=+642.020242599"
Apr 20 19:36:26.542310 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.542282 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-576cffc9f8-65n9h"
Apr 20 19:36:26.551439 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.551405 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-lq5v6"
Apr 20 19:36:26.602590 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.602551 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd69t\" (UniqueName: \"kubernetes.io/projected/661a56e3-9af8-4ef6-9907-f4df0bb0bdc5-kube-api-access-fd69t\") pod \"661a56e3-9af8-4ef6-9907-f4df0bb0bdc5\" (UID: \"661a56e3-9af8-4ef6-9907-f4df0bb0bdc5\") "
Apr 20 19:36:26.602796 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.602664 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dwsm\" (UniqueName: \"kubernetes.io/projected/6178ac8c-e6a1-4920-8e7a-0319af994c36-kube-api-access-7dwsm\") pod \"6178ac8c-e6a1-4920-8e7a-0319af994c36\" (UID: \"6178ac8c-e6a1-4920-8e7a-0319af994c36\") "
Apr 20 19:36:26.605040 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.605006 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/661a56e3-9af8-4ef6-9907-f4df0bb0bdc5-kube-api-access-fd69t" (OuterVolumeSpecName: "kube-api-access-fd69t") pod "661a56e3-9af8-4ef6-9907-f4df0bb0bdc5" (UID: "661a56e3-9af8-4ef6-9907-f4df0bb0bdc5"). InnerVolumeSpecName "kube-api-access-fd69t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:36:26.605097 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.605006 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6178ac8c-e6a1-4920-8e7a-0319af994c36-kube-api-access-7dwsm" (OuterVolumeSpecName: "kube-api-access-7dwsm") pod "6178ac8c-e6a1-4920-8e7a-0319af994c36" (UID: "6178ac8c-e6a1-4920-8e7a-0319af994c36"). InnerVolumeSpecName "kube-api-access-7dwsm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:36:26.703570 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.703528 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fd69t\" (UniqueName: \"kubernetes.io/projected/661a56e3-9af8-4ef6-9907-f4df0bb0bdc5-kube-api-access-fd69t\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:36:26.703570 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:26.703562 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7dwsm\" (UniqueName: \"kubernetes.io/projected/6178ac8c-e6a1-4920-8e7a-0319af994c36-kube-api-access-7dwsm\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\""
Apr 20 19:36:27.290743 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.290650 2575 generic.go:358] "Generic (PLEG): container finished" podID="661a56e3-9af8-4ef6-9907-f4df0bb0bdc5" containerID="84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415" exitCode=0
Apr 20 19:36:27.290743 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.290702 2575 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-lq5v6" Apr 20 19:36:27.291251 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.290745 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-lq5v6" event={"ID":"661a56e3-9af8-4ef6-9907-f4df0bb0bdc5","Type":"ContainerDied","Data":"84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415"} Apr 20 19:36:27.291251 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.290789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-lq5v6" event={"ID":"661a56e3-9af8-4ef6-9907-f4df0bb0bdc5","Type":"ContainerDied","Data":"ebe3ce4b8e45c35b6ce78ac76355bb08a2471901361bf2adc7b3a6fa6ee9cbb5"} Apr 20 19:36:27.291251 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.290810 2575 scope.go:117] "RemoveContainer" containerID="84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415" Apr 20 19:36:27.292208 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.292188 2575 generic.go:358] "Generic (PLEG): container finished" podID="6178ac8c-e6a1-4920-8e7a-0319af994c36" containerID="93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d" exitCode=0 Apr 20 19:36:27.292287 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.292246 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-576cffc9f8-65n9h" Apr 20 19:36:27.292346 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.292275 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-576cffc9f8-65n9h" event={"ID":"6178ac8c-e6a1-4920-8e7a-0319af994c36","Type":"ContainerDied","Data":"93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d"} Apr 20 19:36:27.292397 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.292360 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-576cffc9f8-65n9h" event={"ID":"6178ac8c-e6a1-4920-8e7a-0319af994c36","Type":"ContainerDied","Data":"6faf4bf087107f1c1163865be33ebc8a2b96d8ef7b39fb8d57dabd1c507a634c"} Apr 20 19:36:27.303262 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.303236 2575 scope.go:117] "RemoveContainer" containerID="84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415" Apr 20 19:36:27.303592 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:36:27.303572 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415\": container with ID starting with 84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415 not found: ID does not exist" containerID="84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415" Apr 20 19:36:27.303714 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.303642 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415"} err="failed to get container status \"84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415\": rpc error: code = NotFound desc = could not find container \"84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415\": container with ID starting with 
84ff52bcedb2e55e673ea61191895bb0143c26d8fc7e33b1a2d2a838ef046415 not found: ID does not exist" Apr 20 19:36:27.303714 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.303675 2575 scope.go:117] "RemoveContainer" containerID="93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d" Apr 20 19:36:27.310282 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.310237 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lq5v6"] Apr 20 19:36:27.314427 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.314407 2575 scope.go:117] "RemoveContainer" containerID="93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d" Apr 20 19:36:27.314652 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.314600 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-lq5v6"] Apr 20 19:36:27.314785 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:36:27.314764 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d\": container with ID starting with 93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d not found: ID does not exist" containerID="93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d" Apr 20 19:36:27.320629 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.315216 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d"} err="failed to get container status \"93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d\": rpc error: code = NotFound desc = could not find container \"93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d\": container with ID starting with 93f580686c9428459f770d57eca5a1bcb15f60241c84715c6441b49c56e55e4d not found: ID does not exist" Apr 20 19:36:27.324746 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:36:27.324712 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-576cffc9f8-65n9h"] Apr 20 19:36:27.327541 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:27.327514 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-576cffc9f8-65n9h"] Apr 20 19:36:28.908790 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:28.908753 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6178ac8c-e6a1-4920-8e7a-0319af994c36" path="/var/lib/kubelet/pods/6178ac8c-e6a1-4920-8e7a-0319af994c36/volumes" Apr 20 19:36:28.909173 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:28.909155 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="661a56e3-9af8-4ef6-9907-f4df0bb0bdc5" path="/var/lib/kubelet/pods/661a56e3-9af8-4ef6-9907-f4df0bb0bdc5/volumes" Apr 20 19:36:33.261749 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:36:33.261711 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:37:34.316218 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.316122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7685fdfdcd-nvb6x"] Apr 20 19:37:34.316659 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.316464 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="661a56e3-9af8-4ef6-9907-f4df0bb0bdc5" containerName="authorino" Apr 20 19:37:34.316659 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.316476 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="661a56e3-9af8-4ef6-9907-f4df0bb0bdc5" containerName="authorino" Apr 20 19:37:34.316659 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.316497 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6178ac8c-e6a1-4920-8e7a-0319af994c36" containerName="authorino" Apr 20 19:37:34.316659 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.316503 
2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6178ac8c-e6a1-4920-8e7a-0319af994c36" containerName="authorino" Apr 20 19:37:34.316659 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.316553 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="661a56e3-9af8-4ef6-9907-f4df0bb0bdc5" containerName="authorino" Apr 20 19:37:34.316659 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.316563 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6178ac8c-e6a1-4920-8e7a-0319af994c36" containerName="authorino" Apr 20 19:37:34.319744 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.319721 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.323337 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.323311 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\"" Apr 20 19:37:34.330037 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.330004 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7685fdfdcd-nvb6x"] Apr 20 19:37:34.387605 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.387558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-tls-cert\") pod \"authorino-7685fdfdcd-nvb6x\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.387815 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.387662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-oidc-ca\") pod \"authorino-7685fdfdcd-nvb6x\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.387815 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.387708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn9rd\" (UniqueName: \"kubernetes.io/projected/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-kube-api-access-dn9rd\") pod \"authorino-7685fdfdcd-nvb6x\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.488713 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.488669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-oidc-ca\") pod \"authorino-7685fdfdcd-nvb6x\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.488884 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.488730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn9rd\" (UniqueName: \"kubernetes.io/projected/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-kube-api-access-dn9rd\") pod \"authorino-7685fdfdcd-nvb6x\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.488884 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.488783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-tls-cert\") pod \"authorino-7685fdfdcd-nvb6x\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.489369 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.489344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-oidc-ca\") pod \"authorino-7685fdfdcd-nvb6x\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " 
pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.491370 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.491337 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-tls-cert\") pod \"authorino-7685fdfdcd-nvb6x\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.497757 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.497718 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn9rd\" (UniqueName: \"kubernetes.io/projected/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-kube-api-access-dn9rd\") pod \"authorino-7685fdfdcd-nvb6x\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.630808 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.630707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:37:34.765697 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:34.765667 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7685fdfdcd-nvb6x"] Apr 20 19:37:34.767950 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:37:34.767915 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdc2b373_33f7_45de_9911_ae3d2cdb7cf8.slice/crio-2d59b843b2b9b046aa7caa1515aff1b1767ad2180e7f64acb4a9a84c95048087 WatchSource:0}: Error finding container 2d59b843b2b9b046aa7caa1515aff1b1767ad2180e7f64acb4a9a84c95048087: Status 404 returned error can't find the container with id 2d59b843b2b9b046aa7caa1515aff1b1767ad2180e7f64acb4a9a84c95048087 Apr 20 19:37:35.572545 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.572442 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" 
event={"ID":"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8","Type":"ContainerStarted","Data":"af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250"} Apr 20 19:37:35.572545 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.572482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" event={"ID":"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8","Type":"ContainerStarted","Data":"2d59b843b2b9b046aa7caa1515aff1b1767ad2180e7f64acb4a9a84c95048087"} Apr 20 19:37:35.593711 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.593650 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" podStartSLOduration=1.203481404 podStartE2EDuration="1.593631238s" podCreationTimestamp="2026-04-20 19:37:34 +0000 UTC" firstStartedPulling="2026-04-20 19:37:34.769198761 +0000 UTC m=+710.459612013" lastFinishedPulling="2026-04-20 19:37:35.159348594 +0000 UTC m=+710.849761847" observedRunningTime="2026-04-20 19:37:35.591789749 +0000 UTC m=+711.282203035" watchObservedRunningTime="2026-04-20 19:37:35.593631238 +0000 UTC m=+711.284044508" Apr 20 19:37:35.623626 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.623565 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-78596c8ffd-kp5v8"] Apr 20 19:37:35.623825 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.623803 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-78596c8ffd-kp5v8" podUID="fdfe2ca0-c39e-42c1-a16e-82d3960202e3" containerName="authorino" containerID="cri-o://d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558" gracePeriod=30 Apr 20 19:37:35.874307 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.874279 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-78596c8ffd-kp5v8" Apr 20 19:37:35.900864 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.900829 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-tls-cert\") pod \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\" (UID: \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\") " Apr 20 19:37:35.901052 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.900878 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98xfc\" (UniqueName: \"kubernetes.io/projected/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-kube-api-access-98xfc\") pod \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\" (UID: \"fdfe2ca0-c39e-42c1-a16e-82d3960202e3\") " Apr 20 19:37:35.903269 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.903230 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-kube-api-access-98xfc" (OuterVolumeSpecName: "kube-api-access-98xfc") pod "fdfe2ca0-c39e-42c1-a16e-82d3960202e3" (UID: "fdfe2ca0-c39e-42c1-a16e-82d3960202e3"). InnerVolumeSpecName "kube-api-access-98xfc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:37:35.912518 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:35.912478 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "fdfe2ca0-c39e-42c1-a16e-82d3960202e3" (UID: "fdfe2ca0-c39e-42c1-a16e-82d3960202e3"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:37:36.002257 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.002223 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-tls-cert\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:37:36.002257 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.002260 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-98xfc\" (UniqueName: \"kubernetes.io/projected/fdfe2ca0-c39e-42c1-a16e-82d3960202e3-kube-api-access-98xfc\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:37:36.577301 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.577263 2575 generic.go:358] "Generic (PLEG): container finished" podID="fdfe2ca0-c39e-42c1-a16e-82d3960202e3" containerID="d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558" exitCode=0 Apr 20 19:37:36.577792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.577314 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-78596c8ffd-kp5v8" Apr 20 19:37:36.577792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.577347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-78596c8ffd-kp5v8" event={"ID":"fdfe2ca0-c39e-42c1-a16e-82d3960202e3","Type":"ContainerDied","Data":"d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558"} Apr 20 19:37:36.577792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.577386 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-78596c8ffd-kp5v8" event={"ID":"fdfe2ca0-c39e-42c1-a16e-82d3960202e3","Type":"ContainerDied","Data":"555e488b3dc8af058d53c12d6acf31ddfaf331e2865378e83c49ec773357b2ba"} Apr 20 19:37:36.577792 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.577402 2575 scope.go:117] "RemoveContainer" containerID="d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558" Apr 20 19:37:36.586978 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.586955 2575 scope.go:117] "RemoveContainer" containerID="d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558" Apr 20 19:37:36.587305 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:37:36.587279 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558\": container with ID starting with d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558 not found: ID does not exist" containerID="d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558" Apr 20 19:37:36.587360 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.587320 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558"} err="failed to get container status \"d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558\": rpc error: code = 
NotFound desc = could not find container \"d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558\": container with ID starting with d05b66fb5c6974369cf41c77d30e4c55ae723337b95a098be009646fef79d558 not found: ID does not exist" Apr 20 19:37:36.606314 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.606265 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-78596c8ffd-kp5v8"] Apr 20 19:37:36.611514 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.611474 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-78596c8ffd-kp5v8"] Apr 20 19:37:36.908091 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:36.908055 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdfe2ca0-c39e-42c1-a16e-82d3960202e3" path="/var/lib/kubelet/pods/fdfe2ca0-c39e-42c1-a16e-82d3960202e3/volumes" Apr 20 19:37:51.886211 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:51.886160 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:37:59.758128 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:37:59.758084 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:38:10.953407 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:38:10.953370 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:38:22.352589 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:38:22.352487 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:38:39.862779 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:38:39.862741 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-cx6q2"] Apr 20 19:39:09.553178 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.553135 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/authorino-86d894b9c9-qj7dp"] Apr 20 19:39:09.553573 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.553489 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fdfe2ca0-c39e-42c1-a16e-82d3960202e3" containerName="authorino" Apr 20 19:39:09.553573 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.553501 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdfe2ca0-c39e-42c1-a16e-82d3960202e3" containerName="authorino" Apr 20 19:39:09.553573 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.553556 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fdfe2ca0-c39e-42c1-a16e-82d3960202e3" containerName="authorino" Apr 20 19:39:09.556591 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.556569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.564629 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.564564 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-86d894b9c9-qj7dp"] Apr 20 19:39:09.627893 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.627848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/942bbed1-b7d1-491b-890a-99cc9dad95cf-oidc-ca\") pod \"authorino-86d894b9c9-qj7dp\" (UID: \"942bbed1-b7d1-491b-890a-99cc9dad95cf\") " pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.628079 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.627939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/942bbed1-b7d1-491b-890a-99cc9dad95cf-tls-cert\") pod \"authorino-86d894b9c9-qj7dp\" (UID: \"942bbed1-b7d1-491b-890a-99cc9dad95cf\") " pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.628079 ip-10-0-135-55 kubenswrapper[2575]: I0420 
19:39:09.627964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5v2\" (UniqueName: \"kubernetes.io/projected/942bbed1-b7d1-491b-890a-99cc9dad95cf-kube-api-access-rx5v2\") pod \"authorino-86d894b9c9-qj7dp\" (UID: \"942bbed1-b7d1-491b-890a-99cc9dad95cf\") " pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.728883 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.728845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/942bbed1-b7d1-491b-890a-99cc9dad95cf-oidc-ca\") pod \"authorino-86d894b9c9-qj7dp\" (UID: \"942bbed1-b7d1-491b-890a-99cc9dad95cf\") " pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.729069 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.728912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/942bbed1-b7d1-491b-890a-99cc9dad95cf-tls-cert\") pod \"authorino-86d894b9c9-qj7dp\" (UID: \"942bbed1-b7d1-491b-890a-99cc9dad95cf\") " pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.729069 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.728930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5v2\" (UniqueName: \"kubernetes.io/projected/942bbed1-b7d1-491b-890a-99cc9dad95cf-kube-api-access-rx5v2\") pod \"authorino-86d894b9c9-qj7dp\" (UID: \"942bbed1-b7d1-491b-890a-99cc9dad95cf\") " pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.729658 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.729597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/942bbed1-b7d1-491b-890a-99cc9dad95cf-oidc-ca\") pod \"authorino-86d894b9c9-qj7dp\" (UID: \"942bbed1-b7d1-491b-890a-99cc9dad95cf\") " pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.731780 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.731752 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/942bbed1-b7d1-491b-890a-99cc9dad95cf-tls-cert\") pod \"authorino-86d894b9c9-qj7dp\" (UID: \"942bbed1-b7d1-491b-890a-99cc9dad95cf\") " pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.737312 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.737282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx5v2\" (UniqueName: \"kubernetes.io/projected/942bbed1-b7d1-491b-890a-99cc9dad95cf-kube-api-access-rx5v2\") pod \"authorino-86d894b9c9-qj7dp\" (UID: \"942bbed1-b7d1-491b-890a-99cc9dad95cf\") " pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:09.868408 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:09.868283 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-86d894b9c9-qj7dp" Apr 20 19:39:10.014518 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:10.014480 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-86d894b9c9-qj7dp"] Apr 20 19:39:10.021535 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:39:10.021502 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod942bbed1_b7d1_491b_890a_99cc9dad95cf.slice/crio-e0d1a3ee3e2b6f45d99c613f159638f9c7179999607c3f8c1288f2b3b1edde5e WatchSource:0}: Error finding container e0d1a3ee3e2b6f45d99c613f159638f9c7179999607c3f8c1288f2b3b1edde5e: Status 404 returned error can't find the container with id e0d1a3ee3e2b6f45d99c613f159638f9c7179999607c3f8c1288f2b3b1edde5e Apr 20 19:39:10.966840 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:10.966804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-86d894b9c9-qj7dp" 
event={"ID":"942bbed1-b7d1-491b-890a-99cc9dad95cf","Type":"ContainerStarted","Data":"be7f2e20a10bee3b5a65a9fb233bd52b917d4c9472a7cef94aada7ac4e575795"} Apr 20 19:39:10.967353 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:10.966847 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-86d894b9c9-qj7dp" event={"ID":"942bbed1-b7d1-491b-890a-99cc9dad95cf","Type":"ContainerStarted","Data":"e0d1a3ee3e2b6f45d99c613f159638f9c7179999607c3f8c1288f2b3b1edde5e"} Apr 20 19:39:10.988071 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:10.988011 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-86d894b9c9-qj7dp" podStartSLOduration=1.633797765 podStartE2EDuration="1.987994192s" podCreationTimestamp="2026-04-20 19:39:09 +0000 UTC" firstStartedPulling="2026-04-20 19:39:10.022946405 +0000 UTC m=+805.713359657" lastFinishedPulling="2026-04-20 19:39:10.377142813 +0000 UTC m=+806.067556084" observedRunningTime="2026-04-20 19:39:10.985222044 +0000 UTC m=+806.675635319" watchObservedRunningTime="2026-04-20 19:39:10.987994192 +0000 UTC m=+806.678407465" Apr 20 19:39:11.040730 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.040696 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7685fdfdcd-nvb6x"] Apr 20 19:39:11.040963 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.040937 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" podUID="bdc2b373-33f7-45de-9911-ae3d2cdb7cf8" containerName="authorino" containerID="cri-o://af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250" gracePeriod=30 Apr 20 19:39:11.301333 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.301305 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:39:11.341906 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.341864 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-oidc-ca\") pod \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " Apr 20 19:39:11.342109 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.341944 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn9rd\" (UniqueName: \"kubernetes.io/projected/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-kube-api-access-dn9rd\") pod \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " Apr 20 19:39:11.342109 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.342009 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-tls-cert\") pod \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\" (UID: \"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8\") " Apr 20 19:39:11.344280 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.344241 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-kube-api-access-dn9rd" (OuterVolumeSpecName: "kube-api-access-dn9rd") pod "bdc2b373-33f7-45de-9911-ae3d2cdb7cf8" (UID: "bdc2b373-33f7-45de-9911-ae3d2cdb7cf8"). InnerVolumeSpecName "kube-api-access-dn9rd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:39:11.346938 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.346907 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "bdc2b373-33f7-45de-9911-ae3d2cdb7cf8" (UID: "bdc2b373-33f7-45de-9911-ae3d2cdb7cf8"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:39:11.353097 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.353060 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "bdc2b373-33f7-45de-9911-ae3d2cdb7cf8" (UID: "bdc2b373-33f7-45de-9911-ae3d2cdb7cf8"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:39:11.443578 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.443539 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-tls-cert\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:39:11.443578 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.443572 2575 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-oidc-ca\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:39:11.443578 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.443582 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dn9rd\" (UniqueName: \"kubernetes.io/projected/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8-kube-api-access-dn9rd\") on node \"ip-10-0-135-55.ec2.internal\" DevicePath \"\"" Apr 20 19:39:11.971976 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.971935 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="bdc2b373-33f7-45de-9911-ae3d2cdb7cf8" containerID="af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250" exitCode=0 Apr 20 19:39:11.972445 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.972007 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" Apr 20 19:39:11.972445 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.972020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" event={"ID":"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8","Type":"ContainerDied","Data":"af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250"} Apr 20 19:39:11.972445 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.972059 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7685fdfdcd-nvb6x" event={"ID":"bdc2b373-33f7-45de-9911-ae3d2cdb7cf8","Type":"ContainerDied","Data":"2d59b843b2b9b046aa7caa1515aff1b1767ad2180e7f64acb4a9a84c95048087"} Apr 20 19:39:11.972445 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.972074 2575 scope.go:117] "RemoveContainer" containerID="af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250" Apr 20 19:39:11.981265 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.981245 2575 scope.go:117] "RemoveContainer" containerID="af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250" Apr 20 19:39:11.981586 ip-10-0-135-55 kubenswrapper[2575]: E0420 19:39:11.981562 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250\": container with ID starting with af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250 not found: ID does not exist" containerID="af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250" Apr 20 19:39:11.981718 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.981594 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250"} err="failed to get container status \"af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250\": rpc error: code = NotFound desc = could not find container \"af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250\": container with ID starting with af64ef4b63f02b60467e115d62695a2eab107fe3275ae375349d93d34ed52250 not found: ID does not exist" Apr 20 19:39:11.995998 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.995901 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7685fdfdcd-nvb6x"] Apr 20 19:39:11.998229 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:11.998187 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7685fdfdcd-nvb6x"] Apr 20 19:39:12.909701 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:12.909654 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc2b373-33f7-45de-9911-ae3d2cdb7cf8" path="/var/lib/kubelet/pods/bdc2b373-33f7-45de-9911-ae3d2cdb7cf8/volumes" Apr 20 19:39:49.163185 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:49.163139 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-86d894b9c9-qj7dp_942bbed1-b7d1-491b-890a-99cc9dad95cf/authorino/0.log" Apr 20 19:39:53.603017 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:53.602982 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9f747d685-cdf22_39b9a935-d4c4-4f7d-b2f1-461da5d3c126/manager/0.log" Apr 20 19:39:54.458278 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.458239 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj_77968d31-953a-4f8b-942d-87fcb75f5352/pull/0.log" Apr 20 19:39:54.465090 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:39:54.465039 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj_77968d31-953a-4f8b-942d-87fcb75f5352/extract/0.log" Apr 20 19:39:54.471383 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.471351 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj_77968d31-953a-4f8b-942d-87fcb75f5352/util/0.log" Apr 20 19:39:54.577450 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.577421 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg_c1273a89-da0f-4ea8-b5bf-d04231bca953/util/0.log" Apr 20 19:39:54.584223 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.584202 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg_c1273a89-da0f-4ea8-b5bf-d04231bca953/pull/0.log" Apr 20 19:39:54.590956 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.590932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg_c1273a89-da0f-4ea8-b5bf-d04231bca953/extract/0.log" Apr 20 19:39:54.698020 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.697991 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf_48fab0b8-61f2-4a90-9db9-fca6dedd5126/pull/0.log" Apr 20 19:39:54.705191 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.705166 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf_48fab0b8-61f2-4a90-9db9-fca6dedd5126/extract/0.log" Apr 20 19:39:54.712634 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.712556 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf_48fab0b8-61f2-4a90-9db9-fca6dedd5126/util/0.log" Apr 20 19:39:54.826724 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.826696 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9_436fc7c2-2792-4cd5-93aa-53b635a2ab18/util/0.log" Apr 20 19:39:54.834480 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.834453 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9_436fc7c2-2792-4cd5-93aa-53b635a2ab18/pull/0.log" Apr 20 19:39:54.842450 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.842423 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9_436fc7c2-2792-4cd5-93aa-53b635a2ab18/extract/0.log" Apr 20 19:39:54.955170 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:54.955133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-86d894b9c9-qj7dp_942bbed1-b7d1-491b-890a-99cc9dad95cf/authorino/0.log" Apr 20 19:39:55.298091 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:55.298059 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-r85fq_ca8d07b0-2a3f-4077-8471-8a969c792dcf/kuadrant-console-plugin/0.log" Apr 20 19:39:55.638688 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:55.638662 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-cx6q2_b636e4cb-4980-4c0a-84a8-ba712bb3d0b0/limitador/0.log" Apr 20 19:39:56.298948 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:56.298917 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-79f76cb8cc-9vbzt_e4e71888-9f98-483b-ae51-52a227f9b41c/kube-auth-proxy/0.log" Apr 20 19:39:56.609783 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:39:56.609702 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65d6b6f7bd-fw8wn_28176d29-9406-4440-8156-fe54a5e5596e/router/0.log" Apr 20 19:40:01.266139 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.266104 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qqfns/must-gather-zxkf4"] Apr 20 19:40:01.266638 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.266486 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdc2b373-33f7-45de-9911-ae3d2cdb7cf8" containerName="authorino" Apr 20 19:40:01.266638 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.266500 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdc2b373-33f7-45de-9911-ae3d2cdb7cf8" containerName="authorino" Apr 20 19:40:01.266638 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.266569 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdc2b373-33f7-45de-9911-ae3d2cdb7cf8" containerName="authorino" Apr 20 19:40:01.269833 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.269791 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqfns/must-gather-zxkf4" Apr 20 19:40:01.273162 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.273128 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qqfns\"/\"kube-root-ca.crt\"" Apr 20 19:40:01.279256 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.279225 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qqfns\"/\"openshift-service-ca.crt\"" Apr 20 19:40:01.279995 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.279961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qqfns\"/\"default-dockercfg-gfb8d\"" Apr 20 19:40:01.283790 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.283764 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqfns/must-gather-zxkf4"] Apr 20 19:40:01.379875 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.379840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/683df1d2-55dd-4867-abfa-9e4f9bebc728-must-gather-output\") pod \"must-gather-zxkf4\" (UID: \"683df1d2-55dd-4867-abfa-9e4f9bebc728\") " pod="openshift-must-gather-qqfns/must-gather-zxkf4" Apr 20 19:40:01.380056 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.379933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkx5w\" (UniqueName: \"kubernetes.io/projected/683df1d2-55dd-4867-abfa-9e4f9bebc728-kube-api-access-bkx5w\") pod \"must-gather-zxkf4\" (UID: \"683df1d2-55dd-4867-abfa-9e4f9bebc728\") " pod="openshift-must-gather-qqfns/must-gather-zxkf4" Apr 20 19:40:01.480924 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.480883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/683df1d2-55dd-4867-abfa-9e4f9bebc728-must-gather-output\") pod \"must-gather-zxkf4\" (UID: \"683df1d2-55dd-4867-abfa-9e4f9bebc728\") " pod="openshift-must-gather-qqfns/must-gather-zxkf4" Apr 20 19:40:01.481127 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.480987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkx5w\" (UniqueName: \"kubernetes.io/projected/683df1d2-55dd-4867-abfa-9e4f9bebc728-kube-api-access-bkx5w\") pod \"must-gather-zxkf4\" (UID: \"683df1d2-55dd-4867-abfa-9e4f9bebc728\") " pod="openshift-must-gather-qqfns/must-gather-zxkf4" Apr 20 19:40:01.481252 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.481230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/683df1d2-55dd-4867-abfa-9e4f9bebc728-must-gather-output\") pod \"must-gather-zxkf4\" (UID: \"683df1d2-55dd-4867-abfa-9e4f9bebc728\") " pod="openshift-must-gather-qqfns/must-gather-zxkf4" Apr 20 19:40:01.495058 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.495018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkx5w\" (UniqueName: \"kubernetes.io/projected/683df1d2-55dd-4867-abfa-9e4f9bebc728-kube-api-access-bkx5w\") pod \"must-gather-zxkf4\" (UID: \"683df1d2-55dd-4867-abfa-9e4f9bebc728\") " pod="openshift-must-gather-qqfns/must-gather-zxkf4" Apr 20 19:40:01.586360 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.586248 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqfns/must-gather-zxkf4" Apr 20 19:40:01.723190 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:01.723154 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqfns/must-gather-zxkf4"] Apr 20 19:40:01.724174 ip-10-0-135-55 kubenswrapper[2575]: W0420 19:40:01.724135 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod683df1d2_55dd_4867_abfa_9e4f9bebc728.slice/crio-d7686d9dc54ed59c62e60d2a3c49ef770a167d95522a06a9311d5fe62d50974e WatchSource:0}: Error finding container d7686d9dc54ed59c62e60d2a3c49ef770a167d95522a06a9311d5fe62d50974e: Status 404 returned error can't find the container with id d7686d9dc54ed59c62e60d2a3c49ef770a167d95522a06a9311d5fe62d50974e Apr 20 19:40:02.164115 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:02.164071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqfns/must-gather-zxkf4" event={"ID":"683df1d2-55dd-4867-abfa-9e4f9bebc728","Type":"ContainerStarted","Data":"d7686d9dc54ed59c62e60d2a3c49ef770a167d95522a06a9311d5fe62d50974e"} Apr 20 19:40:03.172273 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:03.172229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqfns/must-gather-zxkf4" event={"ID":"683df1d2-55dd-4867-abfa-9e4f9bebc728","Type":"ContainerStarted","Data":"d5afa848427d64d525ef33e211b7d380211e2e44655bb682cd1d73aff724507c"} Apr 20 19:40:03.172273 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:03.172278 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqfns/must-gather-zxkf4" event={"ID":"683df1d2-55dd-4867-abfa-9e4f9bebc728","Type":"ContainerStarted","Data":"4dbcf635d27945d4b546d50d5a3e23b88e5ce9896580d603eea2220443cfdd90"} Apr 20 19:40:03.189906 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:03.189846 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-qqfns/must-gather-zxkf4" podStartSLOduration=1.235294511 podStartE2EDuration="2.189828492s" podCreationTimestamp="2026-04-20 19:40:01 +0000 UTC" firstStartedPulling="2026-04-20 19:40:01.72609054 +0000 UTC m=+857.416503792" lastFinishedPulling="2026-04-20 19:40:02.680624503 +0000 UTC m=+858.371037773" observedRunningTime="2026-04-20 19:40:03.188064975 +0000 UTC m=+858.878478248" watchObservedRunningTime="2026-04-20 19:40:03.189828492 +0000 UTC m=+858.880241765" Apr 20 19:40:04.345434 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:04.345375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-85qqw_18c8a371-8917-4525-b6a4-7df091337b68/global-pull-secret-syncer/0.log" Apr 20 19:40:04.492589 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:04.492529 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-724ct_7d5c4e0a-2236-4bab-82b8-acad605e74cb/konnectivity-agent/0.log" Apr 20 19:40:04.629884 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:04.629792 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-55.ec2.internal_60834707d5f53c027199cd5bc82fb1f6/haproxy/0.log" Apr 20 19:40:08.993001 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:08.992965 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj_77968d31-953a-4f8b-942d-87fcb75f5352/extract/0.log" Apr 20 19:40:09.018561 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.018518 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj_77968d31-953a-4f8b-942d-87fcb75f5352/util/0.log" Apr 20 19:40:09.048656 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.048597 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s2grj_77968d31-953a-4f8b-942d-87fcb75f5352/pull/0.log" Apr 20 19:40:09.081065 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.081029 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg_c1273a89-da0f-4ea8-b5bf-d04231bca953/extract/0.log" Apr 20 19:40:09.109133 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.109102 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg_c1273a89-da0f-4ea8-b5bf-d04231bca953/util/0.log" Apr 20 19:40:09.143224 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.143189 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0crgwg_c1273a89-da0f-4ea8-b5bf-d04231bca953/pull/0.log" Apr 20 19:40:09.176666 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.176627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf_48fab0b8-61f2-4a90-9db9-fca6dedd5126/extract/0.log" Apr 20 19:40:09.204395 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.204367 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf_48fab0b8-61f2-4a90-9db9-fca6dedd5126/util/0.log" Apr 20 19:40:09.231386 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.231354 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed736h4jf_48fab0b8-61f2-4a90-9db9-fca6dedd5126/pull/0.log" Apr 20 19:40:09.264688 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.264568 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9_436fc7c2-2792-4cd5-93aa-53b635a2ab18/extract/0.log" Apr 20 19:40:09.291735 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.291683 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9_436fc7c2-2792-4cd5-93aa-53b635a2ab18/util/0.log" Apr 20 19:40:09.326448 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.326416 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12pbl9_436fc7c2-2792-4cd5-93aa-53b635a2ab18/pull/0.log" Apr 20 19:40:09.374702 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.374667 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-86d894b9c9-qj7dp_942bbed1-b7d1-491b-890a-99cc9dad95cf/authorino/0.log" Apr 20 19:40:09.479917 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.479840 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-r85fq_ca8d07b0-2a3f-4077-8471-8a969c792dcf/kuadrant-console-plugin/0.log" Apr 20 19:40:09.590000 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:09.589895 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-cx6q2_b636e4cb-4980-4c0a-84a8-ba712bb3d0b0/limitador/0.log" Apr 20 19:40:11.111044 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.111012 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d871b7e8-cab0-4215-80e3-01acd23fbf7a/alertmanager/0.log" Apr 20 19:40:11.141409 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.141363 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d871b7e8-cab0-4215-80e3-01acd23fbf7a/config-reloader/0.log" Apr 20 19:40:11.170021 ip-10-0-135-55 
kubenswrapper[2575]: I0420 19:40:11.169992 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d871b7e8-cab0-4215-80e3-01acd23fbf7a/kube-rbac-proxy-web/0.log" Apr 20 19:40:11.194804 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.194773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d871b7e8-cab0-4215-80e3-01acd23fbf7a/kube-rbac-proxy/0.log" Apr 20 19:40:11.227774 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.227739 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d871b7e8-cab0-4215-80e3-01acd23fbf7a/kube-rbac-proxy-metric/0.log" Apr 20 19:40:11.293765 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.293739 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d871b7e8-cab0-4215-80e3-01acd23fbf7a/prom-label-proxy/0.log" Apr 20 19:40:11.343663 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.343631 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_d871b7e8-cab0-4215-80e3-01acd23fbf7a/init-config-reloader/0.log" Apr 20 19:40:11.528025 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.527975 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-9dc95554d-27nnz_85576550-47ec-4bdb-8e87-c3dc63086f67/metrics-server/0.log" Apr 20 19:40:11.554144 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.554110 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-tgvkt_dd1e918e-d7fe-4a79-99b4-8b594bac54b0/monitoring-plugin/0.log" Apr 20 19:40:11.585758 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.585730 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-677ws_02f4c19d-e14b-45e5-8c19-5ad304dc953b/node-exporter/0.log" Apr 20 19:40:11.610761 
ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.610734 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-677ws_02f4c19d-e14b-45e5-8c19-5ad304dc953b/kube-rbac-proxy/0.log" Apr 20 19:40:11.636293 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.636247 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-677ws_02f4c19d-e14b-45e5-8c19-5ad304dc953b/init-textfile/0.log" Apr 20 19:40:11.851763 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.851672 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pfkt2_4ed2128b-fc5f-4c17-b148-fae03c419a6d/kube-rbac-proxy-main/0.log" Apr 20 19:40:11.880641 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.880591 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pfkt2_4ed2128b-fc5f-4c17-b148-fae03c419a6d/kube-rbac-proxy-self/0.log" Apr 20 19:40:11.910457 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:11.910419 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pfkt2_4ed2128b-fc5f-4c17-b148-fae03c419a6d/openshift-state-metrics/0.log" Apr 20 19:40:12.160851 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:12.160818 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-g4s4x_b2f72a49-7b0b-4e66-806b-91f6b151dcc1/prometheus-operator/0.log" Apr 20 19:40:12.183986 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:12.183958 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-g4s4x_b2f72a49-7b0b-4e66-806b-91f6b151dcc1/kube-rbac-proxy/0.log" Apr 20 19:40:13.253505 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.253467 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"]
Apr 20 19:40:13.260333 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.260303 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.266048 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.266016 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"]
Apr 20 19:40:13.301472 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.301435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-podres\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.301676 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.301534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-lib-modules\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.301676 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.301578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f495t\" (UniqueName: \"kubernetes.io/projected/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-kube-api-access-f495t\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.301676 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.301644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-proc\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.301808 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.301677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-sys\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.402337 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.402295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-proc\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.402547 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.402347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-sys\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.402547 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.402390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-podres\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.402547 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.402437 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-lib-modules\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.402547 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.402441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-proc\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.402547 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.402467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f495t\" (UniqueName: \"kubernetes.io/projected/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-kube-api-access-f495t\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.402547 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.402489 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-sys\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.402547 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.402541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-podres\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.402909 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.402657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-lib-modules\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.412812 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.412735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f495t\" (UniqueName: \"kubernetes.io/projected/abfc8c61-abc0-4f8e-b170-3ab861e40b3f-kube-api-access-f495t\") pod \"perf-node-gather-daemonset-564j2\" (UID: \"abfc8c61-abc0-4f8e-b170-3ab861e40b3f\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.574420 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.574304 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:13.759993 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:13.756664 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"]
Apr 20 19:40:14.240079 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:14.239983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2" event={"ID":"abfc8c61-abc0-4f8e-b170-3ab861e40b3f","Type":"ContainerStarted","Data":"98276638563b705058a84259147654f47c68f0c9d87f20193e3d395a2cf95e96"}
Apr 20 19:40:14.240352 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:14.240320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2" event={"ID":"abfc8c61-abc0-4f8e-b170-3ab861e40b3f","Type":"ContainerStarted","Data":"c23bc6f950a61f009664e7027d2b7f1a3d9239392dff7bf6f34e984adca5a9b4"}
Apr 20 19:40:14.241374 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:14.241346 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:14.260604 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:14.260524 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2" podStartSLOduration=1.260505048 podStartE2EDuration="1.260505048s" podCreationTimestamp="2026-04-20 19:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:40:14.258870023 +0000 UTC m=+869.949283302" watchObservedRunningTime="2026-04-20 19:40:14.260505048 +0000 UTC m=+869.950918322"
Apr 20 19:40:16.046534 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:16.046500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d6zt6_5b292e05-658a-4efc-8b35-8f64c0071f73/dns/0.log"
Apr 20 19:40:16.069961 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:16.069929 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-d6zt6_5b292e05-658a-4efc-8b35-8f64c0071f73/kube-rbac-proxy/0.log"
Apr 20 19:40:16.198365 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:16.198334 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bttgn_9ce44109-8d3b-4499-b0fb-fa475f90e132/dns-node-resolver/0.log"
Apr 20 19:40:16.736352 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:16.736319 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-b6xwm_148d86dd-6cae-42a6-9e0e-44b0a13baa33/node-ca/0.log"
Apr 20 19:40:17.754974 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:17.754944 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-79f76cb8cc-9vbzt_e4e71888-9f98-483b-ae51-52a227f9b41c/kube-auth-proxy/0.log"
Apr 20 19:40:17.844906 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:17.844873 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65d6b6f7bd-fw8wn_28176d29-9406-4440-8156-fe54a5e5596e/router/0.log"
Apr 20 19:40:18.399377 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:18.399342 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hsc5z_e89b8bbe-eeb8-48dd-8e7e-9cb9544b9425/serve-healthcheck-canary/0.log"
Apr 20 19:40:18.946225 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:18.946191 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-j9gnb_eb4b3625-678c-45ba-87f5-25257a97723a/insights-operator/1.log"
Apr 20 19:40:18.946701 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:18.946269 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-j9gnb_eb4b3625-678c-45ba-87f5-25257a97723a/insights-operator/0.log"
Apr 20 19:40:19.052816 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:19.052787 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-btnm6_4b58e213-637f-47ba-8916-6ce23705dc6f/kube-rbac-proxy/0.log"
Apr 20 19:40:19.079896 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:19.079869 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-btnm6_4b58e213-637f-47ba-8916-6ce23705dc6f/exporter/0.log"
Apr 20 19:40:19.114959 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:19.114926 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-btnm6_4b58e213-637f-47ba-8916-6ce23705dc6f/extractor/0.log"
Apr 20 19:40:21.267944 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:21.267912 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-564j2"
Apr 20 19:40:21.405002 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:21.404974 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9f747d685-cdf22_39b9a935-d4c4-4f7d-b2f1-461da5d3c126/manager/0.log"
Apr 20 19:40:22.950115 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:22.949589 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6ddf46b867-qzppn_244b2580-8d8a-40f8-915e-e44bcee11364/manager/0.log"
Apr 20 19:40:27.485337 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:27.485290 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xwrgr_7ccf1368-39c1-4d86-9a71-be00cf3fa5f8/migrator/0.log"
Apr 20 19:40:27.509669 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:27.509629 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xwrgr_7ccf1368-39c1-4d86-9a71-be00cf3fa5f8/graceful-termination/0.log"
Apr 20 19:40:28.905933 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:28.905894 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-545qf_ddf66c79-6b7a-4d5c-93f0-e2b401bede8d/kube-multus/0.log"
Apr 20 19:40:29.295262 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:29.295180 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wbqgg_3e3c3be7-736a-4d5e-ae42-c0f7e318af44/kube-multus-additional-cni-plugins/0.log"
Apr 20 19:40:29.329707 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:29.329675 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wbqgg_3e3c3be7-736a-4d5e-ae42-c0f7e318af44/egress-router-binary-copy/0.log"
Apr 20 19:40:29.358589 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:29.358561 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wbqgg_3e3c3be7-736a-4d5e-ae42-c0f7e318af44/cni-plugins/0.log"
Apr 20 19:40:29.383520 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:29.383490 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wbqgg_3e3c3be7-736a-4d5e-ae42-c0f7e318af44/bond-cni-plugin/0.log"
Apr 20 19:40:29.415567 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:29.415527 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wbqgg_3e3c3be7-736a-4d5e-ae42-c0f7e318af44/routeoverride-cni/0.log"
Apr 20 19:40:29.459731 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:29.459698 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wbqgg_3e3c3be7-736a-4d5e-ae42-c0f7e318af44/whereabouts-cni-bincopy/0.log"
Apr 20 19:40:29.515841 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:29.515813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wbqgg_3e3c3be7-736a-4d5e-ae42-c0f7e318af44/whereabouts-cni/0.log"
Apr 20 19:40:29.733474 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:29.733446 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j9xp6_234402da-caaa-48f3-8a69-400f12f55eb6/network-metrics-daemon/0.log"
Apr 20 19:40:29.755966 ip-10-0-135-55 kubenswrapper[2575]: I0420 19:40:29.755933 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j9xp6_234402da-caaa-48f3-8a69-400f12f55eb6/kube-rbac-proxy/0.log"