Apr 21 17:33:35.166493 ip-10-0-134-77 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 17:33:35.636991 ip-10-0-134-77 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 17:33:35.636991 ip-10-0-134-77 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 17:33:35.636991 ip-10-0-134-77 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 17:33:35.636991 ip-10-0-134-77 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 17:33:35.636991 ip-10-0-134-77 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
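The deprecation warnings above all point at the same remedy: move the flags into the kubelet config file named by --config. A minimal sketch of the equivalent KubeletConfiguration fields follows — field names are from the upstream kubelet config-file documentation, and the values are illustrative, not read from this node (note that --pod-infra-container-image has no config-file equivalent; per the log, the CRI runtime supplies the sandbox image going forward):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Config-file equivalents of the deprecated flags (illustrative values):
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"   # was --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # was --volume-plugin-dir
systemReserved:                                               # was --system-reserved
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is replaced by eviction settings, e.g.:
evictionHard:
  imagefs.available: "15%"
```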
Apr 21 17:33:35.639974 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.639865 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 17:33:35.642354 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642333 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 17:33:35.642354 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642353 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 17:33:35.642354 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642356 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642360 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642363 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642367 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642370 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642373 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642375 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642380 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642383 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642386 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642388 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642391 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642394 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642397 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642401 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642404 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642406 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642409 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642411 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642414 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 17:33:35.642447 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642417 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642420 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642423 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642426 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642429 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642431 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642434 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642437 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642441 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642443 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642447 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642449 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642452 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642455 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642458 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642461 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642464 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642466 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642469 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 17:33:35.642981 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642471 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642474 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642476 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642479 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642481 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642484 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642486 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642489 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642491 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642494 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642496 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642499 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642501 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642504 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642507 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642510 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642513 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642516 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642519 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642521 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 17:33:35.643475 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642524 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642526 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642529 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642531 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642534 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642537 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642539 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642544 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642549 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642551 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642554 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642557 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642559 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642563 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642567 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642570 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642573 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642576 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642579 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 17:33:35.643959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642581 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 17:33:35.644437 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642584 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 17:33:35.644437 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642586 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 17:33:35.644437 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642589 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 17:33:35.644437 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642591 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 17:33:35.644437 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.642594 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 17:33:35.645359 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645345 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 17:33:35.645359 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645357 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 17:33:35.645359 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645361 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645364 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645368 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645371 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645374 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645376 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645380 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645382 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645385 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645388 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645391 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645394 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645396 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645400 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645404 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645407 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645409 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645412 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645414 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 17:33:35.645453 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645417 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645419 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645422 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645425 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645427 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645430 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645433 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645435 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645438 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645440 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645443 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645445 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645448 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645450 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645453 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645457 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645460 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645462 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645465 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645468 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 17:33:35.645950 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645471 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645473 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645476 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645478 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645485 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645487 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645490 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645492 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645495 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645497 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645500 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645503 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645505 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645508 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645510 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645513 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645515 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645519 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645523 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645526 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 17:33:35.646479 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645530 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645533 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645537 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645539 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645543 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645545 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645548 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645553 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645556 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645559 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645562 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645564 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645567 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645570 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645572 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645575 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645578 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645581 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645583 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645586 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 17:33:35.646972 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645588 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645591 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645593 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645596 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.645598 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645676 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645691 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645698 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645703 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645707 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645711 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645717 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645721 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645725 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645728 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645732 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645735 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645739 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645742 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645745 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645748 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645751 2573 flags.go:64] FLAG: --cloud-config=""
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645754 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 17:33:35.647485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645757 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645763 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645766 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645769 2573 flags.go:64] FLAG: --config-dir=""
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645772 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645776 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645780 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645786 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645789 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645792 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645796 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645799 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645802 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645806 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645809 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645813 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645816 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645819 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645822 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645826 2573 flags.go:64] FLAG: --enable-server="true"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645829 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645834 2573 flags.go:64] FLAG: --event-burst="100"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645837 2573 flags.go:64] FLAG: --event-qps="50"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645840 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645844 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 17:33:35.648058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645847 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645851 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645854 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645857 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645860 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645864 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645867 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645870 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645873 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645876 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645879 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645882 2573 flags.go:64] FLAG: --feature-gates=""
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645886 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645889 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645892 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645896 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645899 2573 flags.go:64] FLAG: --healthz-port="10248"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645902 2573 flags.go:64] FLAG: --help="false"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645905 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-134-77.ec2.internal"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645908 2573 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645911 2573 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645914 2573 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645917 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 17:33:35.648688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645920 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645923 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645926 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645929 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645933 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645936 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645939 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645942 2573 flags.go:64] FLAG: --kube-reserved="" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645945 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645948 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645951 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645959 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 17:33:35.649264 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:33:35.645963 2573 flags.go:64] FLAG: --lock-file="" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645966 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645969 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645972 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645978 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645981 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645984 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645987 2573 flags.go:64] FLAG: --logging-format="text" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645990 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645993 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645996 2573 flags.go:64] FLAG: --manifest-url="" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.645999 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646004 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 17:33:35.649264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646007 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646011 2573 flags.go:64] FLAG: --max-pods="110" Apr 21 17:33:35.649920 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646015 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646019 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646022 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646025 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646028 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646031 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646034 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646041 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646044 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646048 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646051 2573 flags.go:64] FLAG: --pod-cidr="" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646053 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646060 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646063 2573 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646066 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646069 2573 flags.go:64] FLAG: --port="10250" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646072 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646077 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ba4e510b44aff185" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646080 2573 flags.go:64] FLAG: --qos-reserved="" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646083 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646086 2573 flags.go:64] FLAG: --register-node="true" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646089 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 21 17:33:35.649920 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646092 2573 flags.go:64] FLAG: --register-with-taints="" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646096 2573 flags.go:64] FLAG: --registry-burst="10" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646099 2573 flags.go:64] FLAG: --registry-qps="5" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646102 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646105 2573 flags.go:64] FLAG: --reserved-memory="" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646109 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646113 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 
17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646116 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646119 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646122 2573 flags.go:64] FLAG: --runonce="false" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646125 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646142 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646145 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646148 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646151 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646154 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646157 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646161 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646164 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646167 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646170 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:33:35.646173 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646176 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646179 2573 flags.go:64] FLAG: --system-cgroups="" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646182 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 17:33:35.650511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646188 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646191 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646195 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646200 2573 flags.go:64] FLAG: --tls-min-version="" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646203 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646206 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646210 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646213 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646216 2573 flags.go:64] FLAG: --v="2" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646221 2573 flags.go:64] FLAG: --version="false" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646225 2573 flags.go:64] FLAG: --vmodule="" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:33:35.646230 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.646233 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646334 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646338 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646341 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646344 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646350 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646352 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646355 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646358 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646362 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 17:33:35.651190 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646366 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646369 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646371 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646377 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646380 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646384 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646387 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646390 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646393 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646395 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646398 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646401 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646406 2573 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646409 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646411 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646414 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646416 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646419 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646422 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646424 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 17:33:35.651723 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646427 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646430 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646433 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646436 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646439 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 17:33:35.652293 
ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646441 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646444 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646446 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646454 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646456 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646459 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646462 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646464 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646467 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646470 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646473 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646476 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646479 2573 feature_gate.go:328] unrecognized feature gate: 
EtcdBackendQuota Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646481 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646484 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 17:33:35.652293 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646487 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646490 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646493 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646495 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646499 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646502 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646505 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646508 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646510 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646513 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646516 2573 
feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646518 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646521 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646524 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646527 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646530 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646532 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646535 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646537 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646540 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 17:33:35.652788 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646542 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646551 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646554 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 17:33:35.653318 ip-10-0-134-77 
kubenswrapper[2573]: W0421 17:33:35.646556 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646559 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646562 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646564 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646572 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646574 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646577 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646579 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646582 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646584 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646587 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646589 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646592 2573 feature_gate.go:328] unrecognized feature gate: 
GatewayAPIController Apr 21 17:33:35.653318 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.646596 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 17:33:35.653732 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.647304 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 17:33:35.654302 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.654278 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 17:33:35.654340 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.654303 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 17:33:35.654368 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654352 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 17:33:35.654368 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654358 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 17:33:35.654368 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654361 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 17:33:35.654368 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654364 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 17:33:35.654368 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654367 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 17:33:35.654368 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654370 2573 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654373 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654376 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654379 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654382 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654385 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654388 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654390 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654393 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654396 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654399 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654402 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654404 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654407 2573 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654410 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654414 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654416 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654419 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654422 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654424 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 17:33:35.654521 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654427 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654430 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654433 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654435 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654438 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654441 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 17:33:35.655016 ip-10-0-134-77 
kubenswrapper[2573]: W0421 17:33:35.654445 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654447 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654450 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654453 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654455 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654458 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654461 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654463 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654468 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654470 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654473 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654476 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654478 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 17:33:35.655016 
ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654481 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 17:33:35.655016 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654484 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654486 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654489 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654492 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654494 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654497 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654500 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654502 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654505 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654507 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654510 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654512 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 
17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654515 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654518 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654520 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654523 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654526 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654528 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654532 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654535 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 17:33:35.655512 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654540 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654544 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654547 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654549 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654552 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654555 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654558 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654561 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654563 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654566 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654568 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654571 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654574 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 
17:33:35.654576 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654579 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654581 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654583 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654586 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654590 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 17:33:35.655993 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654595 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654598 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.654603 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654730 2573 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654735 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654738 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654742 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654745 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654748 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654751 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654753 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654756 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654759 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654762 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654765 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 17:33:35.656509 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654768 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 
17:33:35.654771 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654773 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654776 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654779 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654782 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654785 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654788 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654791 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654793 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654796 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654798 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654802 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654806 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654808 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654811 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654814 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654816 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654819 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654821 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 17:33:35.656886 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654824 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654826 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654829 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654832 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654836 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654839 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654842 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654844 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654847 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654850 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654852 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654855 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654858 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654860 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654862 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654865 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654868 2573 
feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654871 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654873 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 17:33:35.657439 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654876 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654878 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654881 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654884 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654886 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654889 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654891 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654894 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654896 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654899 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 17:33:35.658158 
ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654901 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654904 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654907 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654909 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654912 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654914 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654916 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654919 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654921 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654924 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 17:33:35.658158 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654927 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654930 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654932 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure 
Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654935 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654937 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654940 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654943 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654945 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654948 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654951 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654953 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654956 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654958 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654961 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:35.654964 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.654968 2573 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 17:33:35.658794 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.655854 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 17:33:35.662505 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.662480 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 17:33:35.663458 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.663445 2573 server.go:1019] "Starting client certificate rotation" Apr 21 17:33:35.663562 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.663541 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 17:33:35.663593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.663589 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 17:33:35.690886 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.690859 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 17:33:35.694032 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.694006 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 17:33:35.712841 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.712804 2573 log.go:25] "Validated CRI v1 runtime API" Apr 21 17:33:35.719561 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.719536 2573 log.go:25] 
"Validated CRI v1 image API" Apr 21 17:33:35.721079 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.721059 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 17:33:35.723665 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.723642 2573 fs.go:135] Filesystem UUIDs: map[744b43d5-521b-4338-a9e6-8acf20e39f16:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 7ddd5e18-cec5-468a-bece-9190c2a2aa16:/dev/nvme0n1p3] Apr 21 17:33:35.723737 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.723664 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 17:33:35.730776 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.730644 2573 manager.go:217] Machine: {Timestamp:2026-04-21 17:33:35.72901067 +0000 UTC m=+0.439704769 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092432 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ac30c31630919889dd66483543b50 SystemUUID:ec2ac30c-3163-0919-889d-d66483543b50 BootID:324ee393-234b-49e1-92d9-c01aa2710d4a Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 
DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4b:e8:f4:8e:5d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4b:e8:f4:8e:5d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:46:89:ad:06:92:8c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 17:33:35.730776 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.730757 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 17:33:35.730917 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.730855 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 17:33:35.732490 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.732457 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 17:33:35.733279 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.733243 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 17:33:35.733426 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.733282 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-77.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage
":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 17:33:35.733474 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.733437 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 17:33:35.733474 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.733446 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 17:33:35.733474 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.733459 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 17:33:35.734511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.734498 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 17:33:35.735971 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.735958 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 21 17:33:35.736102 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.736093 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 17:33:35.739022 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.739006 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 21 17:33:35.739081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.739027 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 17:33:35.739081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.739043 2573 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 21 17:33:35.739081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.739054 2573 kubelet.go:397] "Adding apiserver pod source" Apr 21 17:33:35.739081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.739063 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 17:33:35.740255 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.740237 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 17:33:35.740302 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.740266 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 17:33:35.743739 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.743719 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 17:33:35.745733 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.745717 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 17:33:35.747228 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747215 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 17:33:35.747291 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747234 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 17:33:35.747291 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747241 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 17:33:35.747291 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747246 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 17:33:35.747291 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747252 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 17:33:35.747291 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747259 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 17:33:35.747291 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747273 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 17:33:35.747291 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747278 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 17:33:35.747291 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747286 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 17:33:35.747291 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747292 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 17:33:35.747544 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747305 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 17:33:35.747544 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.747315 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 17:33:35.748220 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.748210 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 17:33:35.748220 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.748221 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 17:33:35.752061 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.752045 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 17:33:35.752166 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.752092 2573 server.go:1295] "Started kubelet" Apr 21 17:33:35.752277 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.752210 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 17:33:35.752337 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.752268 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" 
qps=100 burstTokens=10 Apr 21 17:33:35.752382 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.752365 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 17:33:35.752751 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.752732 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-77.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 17:33:35.752817 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.752782 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-77.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 17:33:35.752902 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.752877 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 17:33:35.752985 ip-10-0-134-77 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 17:33:35.754050 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.754036 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 17:33:35.755497 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.755480 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 21 17:33:35.758731 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.758703 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 17:33:35.759346 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.759332 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 17:33:35.759668 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.758428 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-77.ec2.internal.18a86fa6ee5e20ad default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-77.ec2.internal,UID:ip-10-0-134-77.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-77.ec2.internal,},FirstTimestamp:2026-04-21 17:33:35.752061101 +0000 UTC m=+0.462755199,LastTimestamp:2026-04-21 17:33:35.752061101 +0000 UTC m=+0.462755199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-77.ec2.internal,}" Apr 21 17:33:35.760034 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.759995 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:35.760181 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.760168 2573 
desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 17:33:35.761532 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.760539 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 17:33:35.761532 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.761535 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 17:33:35.761671 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.760670 2573 factory.go:55] Registering systemd factory Apr 21 17:33:35.761671 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.761649 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 21 17:33:35.761671 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.761655 2573 factory.go:223] Registration of the systemd container factory successfully Apr 21 17:33:35.761798 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.761661 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 21 17:33:35.763064 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.763038 2573 factory.go:153] Registering CRI-O factory Apr 21 17:33:35.763064 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.763065 2573 factory.go:223] Registration of the crio container factory successfully Apr 21 17:33:35.763212 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.763119 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 17:33:35.763212 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.763178 2573 factory.go:103] Registering Raw factory Apr 21 17:33:35.763212 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.763203 2573 manager.go:1196] Started watching for new ooms in manager Apr 21 17:33:35.763532 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.763494 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 17:33:35.763625 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.763612 2573 manager.go:319] Starting recovery of all containers Apr 21 17:33:35.775490 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.775287 2573 manager.go:324] Recovery completed Apr 21 17:33:35.776443 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.776405 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-77.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 17:33:35.776443 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.776417 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 17:33:35.780954 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.780937 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 17:33:35.783756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.783735 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 21 17:33:35.783836 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.783769 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 17:33:35.783836 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.783780 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 21 
17:33:35.784345 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.784327 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 17:33:35.784345 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.784341 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 17:33:35.784502 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.784358 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 21 17:33:35.786673 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.786658 2573 policy_none.go:49] "None policy: Start" Apr 21 17:33:35.786731 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.786676 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 17:33:35.786731 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.786687 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 21 17:33:35.795618 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.795528 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-77.ec2.internal.18a86fa6f041bb77 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-77.ec2.internal,UID:ip-10-0-134-77.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-77.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-77.ec2.internal,},FirstTimestamp:2026-04-21 17:33:35.783754615 +0000 UTC m=+0.494448713,LastTimestamp:2026-04-21 17:33:35.783754615 +0000 UTC m=+0.494448713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-77.ec2.internal,}" Apr 21 17:33:35.808431 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.808349 2573 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-77.ec2.internal.18a86fa6f0420924 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-77.ec2.internal,UID:ip-10-0-134-77.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-77.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-77.ec2.internal,},FirstTimestamp:2026-04-21 17:33:35.7837745 +0000 UTC m=+0.494468600,LastTimestamp:2026-04-21 17:33:35.7837745 +0000 UTC m=+0.494468600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-77.ec2.internal,}" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.818401 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-77.ec2.internal.18a86fa6f0423109 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-77.ec2.internal,UID:ip-10-0-134-77.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-77.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-77.ec2.internal,},FirstTimestamp:2026-04-21 17:33:35.783784713 +0000 UTC m=+0.494478812,LastTimestamp:2026-04-21 17:33:35.783784713 +0000 UTC m=+0.494478812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-77.ec2.internal,}" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.826838 2573 manager.go:341] "Starting Device Plugin manager" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.826873 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.826883 2573 server.go:85] "Starting device plugin registration server" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.827245 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.827259 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.827348 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.827434 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.827443 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.827996 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 21 17:33:35.831695 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.828038 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:35.841535 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.841453 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-77.ec2.internal.18a86fa6f30218ab default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-77.ec2.internal,UID:ip-10-0-134-77.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-134-77.ec2.internal,},FirstTimestamp:2026-04-21 17:33:35.829915819 +0000 UTC m=+0.540609906,LastTimestamp:2026-04-21 17:33:35.829915819 +0000 UTC m=+0.540609906,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-77.ec2.internal,}" Apr 21 17:33:35.870034 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.870001 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dd2z4" Apr 21 17:33:35.886015 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.885989 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-dd2z4" Apr 21 17:33:35.903150 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.903054 2573 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 21 17:33:35.904414 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.904386 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 17:33:35.904521 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.904423 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 17:33:35.904521 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.904447 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 17:33:35.904521 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.904458 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 17:33:35.904521 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.904494 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 17:33:35.928395 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.928354 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 17:33:35.929592 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.929571 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 21 17:33:35.929704 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.929603 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 17:33:35.929704 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.929617 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 21 17:33:35.929704 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.929642 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-77.ec2.internal" Apr 21 17:33:35.939285 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.939266 2573 reflector.go:430] "Caches populated" 
type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 17:33:35.952261 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:35.952235 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-77.ec2.internal" Apr 21 17:33:35.952315 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:35.952267 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-77.ec2.internal\": node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.001270 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.001239 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.005266 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.005219 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal"] Apr 21 17:33:36.005333 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.005323 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 17:33:36.007624 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.007593 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 21 17:33:36.007747 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.007636 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 17:33:36.007747 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.007647 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 21 17:33:36.009111 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.009095 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume 
controller attach/detach" Apr 21 17:33:36.009277 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.009250 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.009338 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.009299 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 17:33:36.009928 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.009912 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 21 17:33:36.010005 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.009946 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 17:33:36.010005 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.009961 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 21 17:33:36.010005 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.009913 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 21 17:33:36.010165 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.010016 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 17:33:36.010165 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.010027 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 21 17:33:36.011351 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.011335 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.011413 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.011365 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 17:33:36.012275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.012259 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientMemory" Apr 21 17:33:36.012359 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.012285 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 17:33:36.012359 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.012294 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeHasSufficientPID" Apr 21 17:33:36.036118 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.036086 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-77.ec2.internal\" not found" node="ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.040551 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.040528 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-77.ec2.internal\" not found" node="ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.064310 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.064278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.064474 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:33:36.064316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/454bf7e88903cb3fed5cc9e7d8cf5d0d-config\") pod \"kube-apiserver-proxy-ip-10-0-134-77.ec2.internal\" (UID: \"454bf7e88903cb3fed5cc9e7d8cf5d0d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.064474 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.064337 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.101780 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.101739 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.165070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.164979 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/454bf7e88903cb3fed5cc9e7d8cf5d0d-config\") pod \"kube-apiserver-proxy-ip-10-0-134-77.ec2.internal\" (UID: \"454bf7e88903cb3fed5cc9e7d8cf5d0d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.165070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.165024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.165070 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.165041 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.165308 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.165091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.165308 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.165104 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/454bf7e88903cb3fed5cc9e7d8cf5d0d-config\") pod \"kube-apiserver-proxy-ip-10-0-134-77.ec2.internal\" (UID: \"454bf7e88903cb3fed5cc9e7d8cf5d0d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.165308 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.165165 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/816568b9527d9455f848c001abfac64a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal\" (UID: \"816568b9527d9455f848c001abfac64a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.201886 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.201855 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.302499 
ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.302465 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.337723 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.337687 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.342271 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.342253 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 21 17:33:36.403454 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.403401 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.504038 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.503948 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.604578 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.604538 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.662905 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.662880 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 17:33:36.663480 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.663042 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 17:33:36.705381 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.705332 2573 kubelet_node_status.go:515] "Error getting the current 
node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.759172 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.759113 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 17:33:36.806280 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.806249 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.887633 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.887581 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 17:28:35 +0000 UTC" deadline="2027-11-18 10:03:44.569062265 +0000 UTC" Apr 21 17:33:36.887633 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.887627 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13816h30m7.681441025s" Apr 21 17:33:36.902085 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:36.902048 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod454bf7e88903cb3fed5cc9e7d8cf5d0d.slice/crio-b00aec2754bf9e0469a443c0963a7e7c16dbb699969c247eae72b138643d6501 WatchSource:0}: Error finding container b00aec2754bf9e0469a443c0963a7e7c16dbb699969c247eae72b138643d6501: Status 404 returned error can't find the container with id b00aec2754bf9e0469a443c0963a7e7c16dbb699969c247eae72b138643d6501 Apr 21 17:33:36.902493 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:36.902471 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod816568b9527d9455f848c001abfac64a.slice/crio-56c6930f45968eb2bb03c2611df506617c603e3ead0e13c607a607954a34b6b3 WatchSource:0}: Error finding container 56c6930f45968eb2bb03c2611df506617c603e3ead0e13c607a607954a34b6b3: Status 404 
returned error can't find the container with id 56c6930f45968eb2bb03c2611df506617c603e3ead0e13c607a607954a34b6b3 Apr 21 17:33:36.906354 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:36.906334 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:36.907604 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.907552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" event={"ID":"816568b9527d9455f848c001abfac64a","Type":"ContainerStarted","Data":"56c6930f45968eb2bb03c2611df506617c603e3ead0e13c607a607954a34b6b3"} Apr 21 17:33:36.907823 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.907800 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 17:33:36.908473 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.908441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" event={"ID":"454bf7e88903cb3fed5cc9e7d8cf5d0d","Type":"ContainerStarted","Data":"b00aec2754bf9e0469a443c0963a7e7c16dbb699969c247eae72b138643d6501"} Apr 21 17:33:36.941814 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.941775 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 17:33:36.995435 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:36.995400 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 17:33:37.006633 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.006603 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:37.043301 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.043217 2573 
reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 17:33:37.056337 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.056307 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vmxl5" Apr 21 17:33:37.077090 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.077062 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vmxl5" Apr 21 17:33:37.106773 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.106736 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:37.207518 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.207486 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-77.ec2.internal\" not found" Apr 21 17:33:37.244292 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.244265 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 17:33:37.260270 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.260241 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" Apr 21 17:33:37.279208 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.279177 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 17:33:37.279378 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.279323 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" Apr 21 17:33:37.330525 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.330450 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, 
which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 17:33:37.739896 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.739792 2573 apiserver.go:52] "Watching apiserver" Apr 21 17:33:37.751919 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.751890 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 17:33:37.754086 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.754058 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-8rkhb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal","openshift-multus/multus-additional-cni-plugins-zcgfw","openshift-network-diagnostics/network-check-target-8f9r2","openshift-network-operator/iptables-alerter-np76c","openshift-dns/node-resolver-zvdf9","openshift-image-registry/node-ca-6fqcj","openshift-multus/multus-vmg6z","openshift-multus/network-metrics-daemon-wtk7c","openshift-ovn-kubernetes/ovnkube-node-74gml","kube-system/konnectivity-agent-ct269","kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh"] Apr 21 17:33:37.755648 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.755630 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.756828 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.756809 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.758147 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.758106 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6fqcj" Apr 21 17:33:37.759242 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.759222 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:37.759332 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.759290 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:37.759627 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.759571 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8xq45\"" Apr 21 17:33:37.759858 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.759840 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 17:33:37.760385 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.760359 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-np76c" Apr 21 17:33:37.762953 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.762040 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:33:37.762953 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.762325 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-shh2b\"" Apr 21 17:33:37.762953 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.762386 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 17:33:37.762953 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.762406 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 17:33:37.762953 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.762514 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 17:33:37.762953 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.762582 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 17:33:37.762953 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.762869 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 17:33:37.763337 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.763215 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jgxnc\"" Apr 21 17:33:37.763498 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.763465 2573 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 17:33:37.764332 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.764313 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:37.764439 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.764398 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 17:33:37.764963 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.764936 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hp4ht\"" Apr 21 17:33:37.765100 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.765080 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:33:37.765424 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.765405 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 17:33:37.766781 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.766763 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.768080 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.768060 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.768237 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.768218 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:37.768311 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.768294 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:37.769578 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.769558 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.770801 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.770783 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ct269" Apr 21 17:33:37.773925 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.773896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-sys-fs\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.774032 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.773940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.774032 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.773968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-kgj2j\" (UniqueName: \"kubernetes.io/projected/755e2315-552e-4e00-ba7b-cf1e07a5c8d1-kube-api-access-kgj2j\") pod \"node-ca-6fqcj\" (UID: \"755e2315-552e-4e00-ba7b-cf1e07a5c8d1\") " pod="openshift-image-registry/node-ca-6fqcj" Apr 21 17:33:37.774032 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.773999 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-lib-modules\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774032 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-sysconfig\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774063 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 17:33:37.774259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d4691871-0174-4f20-9333-c53e5c44774f-iptables-alerter-script\") pod \"iptables-alerter-np76c\" (UID: \"d4691871-0174-4f20-9333-c53e5c44774f\") " pod="openshift-network-operator/iptables-alerter-np76c" Apr 21 17:33:37.774259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-modprobe-d\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774174 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/755e2315-552e-4e00-ba7b-cf1e07a5c8d1-host\") pod \"node-ca-6fqcj\" (UID: \"755e2315-552e-4e00-ba7b-cf1e07a5c8d1\") " pod="openshift-image-registry/node-ca-6fqcj" Apr 21 17:33:37.774259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-sysctl-conf\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774227 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-sys\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84cw2\" (UniqueName: \"kubernetes.io/projected/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-kube-api-access-84cw2\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774304 2573 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-kubernetes\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774329 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpc8p\" (UniqueName: \"kubernetes.io/projected/5dd32873-afbe-4cda-ad81-5c8d17abaeb9-kube-api-access-tpc8p\") pod \"node-resolver-zvdf9\" (UID: \"5dd32873-afbe-4cda-ad81-5c8d17abaeb9\") " pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774366 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-host\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-tuned\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774419 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-tmp\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774491 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774499 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq9jh\" (UniqueName: \"kubernetes.io/projected/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-kube-api-access-gq9jh\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-device-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.774584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774568 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-etc-selinux\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774601 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-sysctl-d\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774645 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774658 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774657 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-systemd\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774707 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-run\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774720 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nzsmm\"" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-var-lib-kubelet\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/755e2315-552e-4e00-ba7b-cf1e07a5c8d1-serviceca\") pod \"node-ca-6fqcj\" (UID: \"755e2315-552e-4e00-ba7b-cf1e07a5c8d1\") " pod="openshift-image-registry/node-ca-6fqcj" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774791 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-socket-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774825 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4691871-0174-4f20-9333-c53e5c44774f-host-slash\") pod \"iptables-alerter-np76c\" (UID: \"d4691871-0174-4f20-9333-c53e5c44774f\") " pod="openshift-network-operator/iptables-alerter-np76c" Apr 21 17:33:37.775194 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:33:37.774851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpch5\" (UniqueName: \"kubernetes.io/projected/d4691871-0174-4f20-9333-c53e5c44774f-kube-api-access-cpch5\") pod \"iptables-alerter-np76c\" (UID: \"d4691871-0174-4f20-9333-c53e5c44774f\") " pod="openshift-network-operator/iptables-alerter-np76c" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774874 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5dd32873-afbe-4cda-ad81-5c8d17abaeb9-hosts-file\") pod \"node-resolver-zvdf9\" (UID: \"5dd32873-afbe-4cda-ad81-5c8d17abaeb9\") " pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2gvp\" (UniqueName: \"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774929 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5dd32873-afbe-4cda-ad81-5c8d17abaeb9-tmp-dir\") pod \"node-resolver-zvdf9\" (UID: \"5dd32873-afbe-4cda-ad81-5c8d17abaeb9\") " pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:37.775194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.774956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-registration-dir\") pod 
\"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.775828 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.775500 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 17:33:37.775828 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.775500 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 17:33:37.775828 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.775692 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 17:33:37.775828 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.775701 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-n5nk4\"" Apr 21 17:33:37.775828 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.775705 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 17:33:37.775828 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.775774 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 17:33:37.776964 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.776939 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g24gs\"" Apr 21 17:33:37.777065 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.777050 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 17:33:37.777121 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.777065 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-95lrc\"" Apr 21 17:33:37.777216 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.777168 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 17:33:37.777277 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.777237 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 17:33:37.777361 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.777345 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q4hxb\"" Apr 21 17:33:37.862264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.862227 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 17:33:37.876024 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.875985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5dd32873-afbe-4cda-ad81-5c8d17abaeb9-hosts-file\") pod \"node-resolver-zvdf9\" (UID: \"5dd32873-afbe-4cda-ad81-5c8d17abaeb9\") " pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:37.876024 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gvp\" (UniqueName: \"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876056 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5dd32873-afbe-4cda-ad81-5c8d17abaeb9-tmp-dir\") pod \"node-resolver-zvdf9\" 
(UID: \"5dd32873-afbe-4cda-ad81-5c8d17abaeb9\") " pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-sys-fs\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876109 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-ovnkube-script-lib\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876126 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5dd32873-afbe-4cda-ad81-5c8d17abaeb9-hosts-file\") pod \"node-resolver-zvdf9\" (UID: \"5dd32873-afbe-4cda-ad81-5c8d17abaeb9\") " pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876147 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-run-multus-certs\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-multus-cni-dir\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876214 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-sys-fs\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-run-ovn\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-log-socket\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.876275 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876268 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97f71749-bd4e-478c-9148-e08f6598c072-cni-binary-copy\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-multus-socket-dir-parent\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876334 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmfq5\" (UniqueName: \"kubernetes.io/projected/97f71749-bd4e-478c-9148-e08f6598c072-kube-api-access-tmfq5\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876354 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvwv\" (UniqueName: \"kubernetes.io/projected/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-kube-api-access-5dvwv\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:37.876802 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876378 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-sysconfig\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876404 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-cni-netd\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876426 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-env-overrides\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876448 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-cnibin\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876472 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-hostroot\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 
17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876505 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-sysctl-conf\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-cni-binary-copy\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876555 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dksbn\" (UniqueName: \"kubernetes.io/projected/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-kube-api-access-dksbn\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876578 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-kubernetes\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc8p\" (UniqueName: \"kubernetes.io/projected/5dd32873-afbe-4cda-ad81-5c8d17abaeb9-kube-api-access-tpc8p\") pod \"node-resolver-zvdf9\" (UID: \"5dd32873-afbe-4cda-ad81-5c8d17abaeb9\") " pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-host\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.876802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876677 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-tuned\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876704 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gq9jh\" (UniqueName: \"kubernetes.io/projected/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-kube-api-access-gq9jh\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-device-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/755e2315-552e-4e00-ba7b-cf1e07a5c8d1-serviceca\") pod \"node-ca-6fqcj\" (UID: \"755e2315-552e-4e00-ba7b-cf1e07a5c8d1\") " pod="openshift-image-registry/node-ca-6fqcj" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-kubelet\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876806 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-run-ovn-kubernetes\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876831 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-var-lib-cni-multus\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876854 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-etc-kubernetes\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr2jw\" (UniqueName: \"kubernetes.io/projected/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-kube-api-access-cr2jw\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876906 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-socket-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876939 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-kubelet-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4691871-0174-4f20-9333-c53e5c44774f-host-slash\") pod \"iptables-alerter-np76c\" (UID: \"d4691871-0174-4f20-9333-c53e5c44774f\") " pod="openshift-network-operator/iptables-alerter-np76c" Apr 21 17:33:37.877593 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpch5\" (UniqueName: \"kubernetes.io/projected/d4691871-0174-4f20-9333-c53e5c44774f-kube-api-access-cpch5\") pod \"iptables-alerter-np76c\" (UID: \"d4691871-0174-4f20-9333-c53e5c44774f\") " pod="openshift-network-operator/iptables-alerter-np76c" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-ovnkube-config\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.876832 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5dd32873-afbe-4cda-ad81-5c8d17abaeb9-tmp-dir\") pod \"node-resolver-zvdf9\" (UID: \"5dd32873-afbe-4cda-ad81-5c8d17abaeb9\") " pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-sysconfig\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.877593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877523 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/755e2315-552e-4e00-ba7b-cf1e07a5c8d1-serviceca\") pod \"node-ca-6fqcj\" (UID: \"755e2315-552e-4e00-ba7b-cf1e07a5c8d1\") " pod="openshift-image-registry/node-ca-6fqcj" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-sysctl-conf\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877705 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-registration-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877714 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-socket-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877749 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-system-cni-dir\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877768 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-device-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877778 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-systemd-units\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-run-openvswitch\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d4691871-0174-4f20-9333-c53e5c44774f-host-slash\") pod \"iptables-alerter-np76c\" (UID: \"d4691871-0174-4f20-9333-c53e5c44774f\") " pod="openshift-network-operator/iptables-alerter-np76c" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877854 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-node-log\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877873 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/065ac800-604a-42e9-b0cc-e56598f56081-agent-certs\") pod \"konnectivity-agent-ct269\" (UID: \"065ac800-604a-42e9-b0cc-e56598f56081\") " pod="kube-system/konnectivity-agent-ct269" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877891 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-system-cni-dir\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgj2j\" (UniqueName: \"kubernetes.io/projected/755e2315-552e-4e00-ba7b-cf1e07a5c8d1-kube-api-access-kgj2j\") pod \"node-ca-6fqcj\" (UID: \"755e2315-552e-4e00-ba7b-cf1e07a5c8d1\") " pod="openshift-image-registry/node-ca-6fqcj" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.877939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-slash\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:33:37.878010 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-cnibin\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-registration-dir\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:37.878367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/065ac800-604a-42e9-b0cc-e56598f56081-konnectivity-ca\") pod \"konnectivity-agent-ct269\" (UID: \"065ac800-604a-42e9-b0cc-e56598f56081\") " pod="kube-system/konnectivity-agent-ct269" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878074 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-run-k8s-cni-cncf-io\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878101 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-run-netns\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " 
pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878124 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-var-lib-cni-bin\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878164 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-var-lib-kubelet\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-lib-modules\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d4691871-0174-4f20-9333-c53e5c44774f-iptables-alerter-script\") pod \"iptables-alerter-np76c\" (UID: \"d4691871-0174-4f20-9333-c53e5c44774f\") " pod="openshift-network-operator/iptables-alerter-np76c" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878248 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-host\") pod \"tuned-8rkhb\" (UID: 
\"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878277 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-kubernetes\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878405 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-lib-modules\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-modprobe-d\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878455 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/755e2315-552e-4e00-ba7b-cf1e07a5c8d1-host\") pod \"node-ca-6fqcj\" (UID: \"755e2315-552e-4e00-ba7b-cf1e07a5c8d1\") " pod="openshift-image-registry/node-ca-6fqcj" Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878481 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-run-systemd\") pod \"ovnkube-node-74gml\" (UID: 
\"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878501 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-modprobe-d\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-etc-openvswitch\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878534 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-cni-bin\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878530 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/755e2315-552e-4e00-ba7b-cf1e07a5c8d1-host\") pod \"node-ca-6fqcj\" (UID: \"755e2315-552e-4e00-ba7b-cf1e07a5c8d1\") " pod="openshift-image-registry/node-ca-6fqcj"
Apr 21 17:33:37.879098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878566 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-os-release\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-ovn-node-metrics-cert\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-sys\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84cw2\" (UniqueName: \"kubernetes.io/projected/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-kube-api-access-84cw2\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878684 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-sys\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878686 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-run-netns\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878747 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-var-lib-openvswitch\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97f71749-bd4e-478c-9148-e08f6598c072-multus-daemon-config\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-tmp\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d4691871-0174-4f20-9333-c53e5c44774f-iptables-alerter-script\") pod \"iptables-alerter-np76c\" (UID: \"d4691871-0174-4f20-9333-c53e5c44774f\") " pod="openshift-network-operator/iptables-alerter-np76c"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-etc-selinux\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878891 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-os-release\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878942 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878964 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-etc-selinux\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh"
Apr 21 17:33:37.879775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.878966 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-multus-conf-dir\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.880475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.879008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-sysctl-d\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.880475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.879034 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-systemd\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.880475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.879059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-run\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.880475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.879083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-var-lib-kubelet\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.880475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.879127 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-systemd\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.880475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.879084 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-sysctl-d\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.880475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.879145 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-run\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.880475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.879193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-var-lib-kubelet\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.881081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.881059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-tmp\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.881206 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.881120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-etc-tuned\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.889797 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.889771 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 17:33:37.889797 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.889798 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 17:33:37.889797 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.889812 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z2gvp for pod openshift-network-diagnostics/network-check-target-8f9r2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 17:33:37.890058 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.889920 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp podName:b9e16605-c885-47fb-ba9d-4b218cc44030 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:38.389892626 +0000 UTC m=+3.100586734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-z2gvp" (UniqueName: "kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp") pod "network-check-target-8f9r2" (UID: "b9e16605-c885-47fb-ba9d-4b218cc44030") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 17:33:37.893852 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.893823 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpch5\" (UniqueName: \"kubernetes.io/projected/d4691871-0174-4f20-9333-c53e5c44774f-kube-api-access-cpch5\") pod \"iptables-alerter-np76c\" (UID: \"d4691871-0174-4f20-9333-c53e5c44774f\") " pod="openshift-network-operator/iptables-alerter-np76c"
Apr 21 17:33:37.899751 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.899723 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc8p\" (UniqueName: \"kubernetes.io/projected/5dd32873-afbe-4cda-ad81-5c8d17abaeb9-kube-api-access-tpc8p\") pod \"node-resolver-zvdf9\" (UID: \"5dd32873-afbe-4cda-ad81-5c8d17abaeb9\") " pod="openshift-dns/node-resolver-zvdf9"
Apr 21 17:33:37.901943 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.901919 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgj2j\" (UniqueName: \"kubernetes.io/projected/755e2315-552e-4e00-ba7b-cf1e07a5c8d1-kube-api-access-kgj2j\") pod \"node-ca-6fqcj\" (UID: \"755e2315-552e-4e00-ba7b-cf1e07a5c8d1\") " pod="openshift-image-registry/node-ca-6fqcj"
Apr 21 17:33:37.902768 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.902744 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84cw2\" (UniqueName: \"kubernetes.io/projected/fba4ce9e-e0ea-4092-a0eb-1f8a684daf98-kube-api-access-84cw2\") pod \"aws-ebs-csi-driver-node-7kvmh\" (UID: \"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh"
Apr 21 17:33:37.904171 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.904151 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq9jh\" (UniqueName: \"kubernetes.io/projected/6a95a69b-a0e2-4e0f-a650-b9b615190bfe-kube-api-access-gq9jh\") pod \"tuned-8rkhb\" (UID: \"6a95a69b-a0e2-4e0f-a650-b9b615190bfe\") " pod="openshift-cluster-node-tuning-operator/tuned-8rkhb"
Apr 21 17:33:37.979924 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.979893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-run-ovn\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.979924 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.979926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-log-socket\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.979946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.980145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.979963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97f71749-bd4e-478c-9148-e08f6598c072-cni-binary-copy\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.980145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-run-ovn\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-multus-socket-dir-parent\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.980145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980061 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-multus-socket-dir-parent\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.980145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980065 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-log-socket\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmfq5\" (UniqueName: \"kubernetes.io/projected/97f71749-bd4e-478c-9148-e08f6598c072-kube-api-access-tmfq5\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.980145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvwv\" (UniqueName: \"kubernetes.io/projected/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-kube-api-access-5dvwv\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c"
Apr 21 17:33:37.980145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-cni-netd\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-env-overrides\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980182 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-cnibin\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-hostroot\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-cni-binary-copy\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980237 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dksbn\" (UniqueName: \"kubernetes.io/projected/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-kube-api-access-dksbn\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980275 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-cni-netd\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-kubelet\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980305 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-hostroot\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-run-ovn-kubernetes\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980351 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-run-ovn-kubernetes\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-var-lib-cni-multus\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-etc-kubernetes\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cr2jw\" (UniqueName: \"kubernetes.io/projected/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-kube-api-access-cr2jw\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-kubelet\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980422 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.980837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980458 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-var-lib-cni-multus\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-etc-kubernetes\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-ovnkube-config\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-system-cni-dir\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.980535 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980547 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-systemd-units\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980572 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-run-openvswitch\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980594 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-node-log\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:37.980609 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs podName:dcd8a3eb-e25c-4dcb-9468-d578a60a826c nodeName:}" failed. No retries permitted until 2026-04-21 17:33:38.48059098 +0000 UTC m=+3.191285086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs") pod "network-metrics-daemon-wtk7c" (UID: "dcd8a3eb-e25c-4dcb-9468-d578a60a826c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-node-log\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/065ac800-604a-42e9-b0cc-e56598f56081-agent-certs\") pod \"konnectivity-agent-ct269\" (UID: \"065ac800-604a-42e9-b0cc-e56598f56081\") " pod="kube-system/konnectivity-agent-ct269"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-system-cni-dir\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-env-overrides\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980674 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-systemd-units\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-slash\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980714 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-cnibin\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980714 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-run-openvswitch\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.981753 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980744 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/065ac800-604a-42e9-b0cc-e56598f56081-konnectivity-ca\") pod \"konnectivity-agent-ct269\" (UID: \"065ac800-604a-42e9-b0cc-e56598f56081\") " pod="kube-system/konnectivity-agent-ct269"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980756 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-slash\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980771 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-run-k8s-cni-cncf-io\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980790 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-cnibin\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980798 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-run-netns\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-var-lib-cni-bin\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-var-lib-kubelet\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980847 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-cni-binary-copy\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-run-systemd\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-run-k8s-cni-cncf-io\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-run-netns\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z"
Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980939 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\"
(UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-var-lib-cni-bin\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-etc-openvswitch\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-var-lib-kubelet\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980981 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-cni-bin\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-run-systemd\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.980878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/97f71749-bd4e-478c-9148-e08f6598c072-cni-binary-copy\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981011 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-os-release\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.982595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-etc-openvswitch\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-cni-bin\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-ovn-node-metrics-cert\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-os-release\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981079 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-run-netns\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981087 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-cnibin\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981110 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-var-lib-openvswitch\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-system-cni-dir\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981154 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-run-netns\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-var-lib-openvswitch\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97f71749-bd4e-478c-9148-e08f6598c072-multus-daemon-config\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981219 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981224 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-os-release\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981271 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-os-release\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981322 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-system-cni-dir\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983419 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 
21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-multus-conf-dir\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-ovnkube-config\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981397 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/065ac800-604a-42e9-b0cc-e56598f56081-konnectivity-ca\") pod \"konnectivity-agent-ct269\" (UID: \"065ac800-604a-42e9-b0cc-e56598f56081\") " pod="kube-system/konnectivity-agent-ct269" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-multus-conf-dir\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-ovnkube-script-lib\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983899 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:33:37.981465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-run-multus-certs\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-multus-cni-dir\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981571 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-multus-cni-dir\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981619 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/97f71749-bd4e-478c-9148-e08f6598c072-host-run-multus-certs\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/97f71749-bd4e-478c-9148-e08f6598c072-multus-daemon-config\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981827 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.981977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-ovnkube-script-lib\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.983163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/065ac800-604a-42e9-b0cc-e56598f56081-agent-certs\") pod \"konnectivity-agent-ct269\" (UID: \"065ac800-604a-42e9-b0cc-e56598f56081\") " pod="kube-system/konnectivity-agent-ct269" Apr 21 17:33:37.983899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.983555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-ovn-node-metrics-cert\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 
17:33:37.988075 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:37.988052 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 17:33:38.007487 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.007410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmfq5\" (UniqueName: \"kubernetes.io/projected/97f71749-bd4e-478c-9148-e08f6598c072-kube-api-access-tmfq5\") pod \"multus-vmg6z\" (UID: \"97f71749-bd4e-478c-9148-e08f6598c072\") " pod="openshift-multus/multus-vmg6z" Apr 21 17:33:38.019042 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.019013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dksbn\" (UniqueName: \"kubernetes.io/projected/d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0-kube-api-access-dksbn\") pod \"multus-additional-cni-plugins-zcgfw\" (UID: \"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0\") " pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:38.020426 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.020395 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvwv\" (UniqueName: \"kubernetes.io/projected/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-kube-api-access-5dvwv\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:38.021747 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.021724 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr2jw\" (UniqueName: \"kubernetes.io/projected/3f956cbc-c15f-455e-8caf-a1b6e26e74ca-kube-api-access-cr2jw\") pod \"ovnkube-node-74gml\" (UID: \"3f956cbc-c15f-455e-8caf-a1b6e26e74ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:38.069714 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.069676 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" Apr 21 17:33:38.076581 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.076552 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" Apr 21 17:33:38.078209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.078181 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 17:28:37 +0000 UTC" deadline="2027-11-08 19:48:55.620243378 +0000 UTC" Apr 21 17:33:38.078312 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.078212 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13586h15m17.54203787s" Apr 21 17:33:38.090523 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.090493 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6fqcj" Apr 21 17:33:38.096592 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.096560 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-np76c" Apr 21 17:33:38.105290 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.105265 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zvdf9" Apr 21 17:33:38.113044 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.113019 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" Apr 21 17:33:38.118952 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.118927 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vmg6z" Apr 21 17:33:38.125687 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.125661 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:33:38.131328 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.131301 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-ct269" Apr 21 17:33:38.485121 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.485013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:38.485121 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.485099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gvp\" (UniqueName: \"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:38.485356 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:38.485172 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:38.485356 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:38.485226 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 17:33:38.485356 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:38.485246 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 17:33:38.485356 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:38.485254 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs podName:dcd8a3eb-e25c-4dcb-9468-d578a60a826c nodeName:}" failed. No retries permitted until 2026-04-21 17:33:39.485232357 +0000 UTC m=+4.195926466 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs") pod "network-metrics-daemon-wtk7c" (UID: "dcd8a3eb-e25c-4dcb-9468-d578a60a826c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:38.485356 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:38.485259 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z2gvp for pod openshift-network-diagnostics/network-check-target-8f9r2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:38.485356 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:38.485304 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp podName:b9e16605-c885-47fb-ba9d-4b218cc44030 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:39.485289765 +0000 UTC m=+4.195983865 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z2gvp" (UniqueName: "kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp") pod "network-check-target-8f9r2" (UID: "b9e16605-c885-47fb-ba9d-4b218cc44030") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:38.542643 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:38.542613 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd12b2fdf_28ef_4cd3_8fe8_ff7631cfc4b0.slice/crio-245c7d27467fc278692e5d28b021eb31f9d65ff9cb53355ccdc4bc29bd8c6dcf WatchSource:0}: Error finding container 245c7d27467fc278692e5d28b021eb31f9d65ff9cb53355ccdc4bc29bd8c6dcf: Status 404 returned error can't find the container with id 245c7d27467fc278692e5d28b021eb31f9d65ff9cb53355ccdc4bc29bd8c6dcf Apr 21 17:33:38.544218 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:38.544175 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065ac800_604a_42e9_b0cc_e56598f56081.slice/crio-0ed07e5a14d43d6eb93e96492c08b3ac7ea56f05850be96a1e6f616bec63b302 WatchSource:0}: Error finding container 0ed07e5a14d43d6eb93e96492c08b3ac7ea56f05850be96a1e6f616bec63b302: Status 404 returned error can't find the container with id 0ed07e5a14d43d6eb93e96492c08b3ac7ea56f05850be96a1e6f616bec63b302 Apr 21 17:33:38.547958 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:38.547935 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod755e2315_552e_4e00_ba7b_cf1e07a5c8d1.slice/crio-c5ef6e8d1d4a6d62ff4404a12342a10388c12843ae2dfb94d6e97efd15519a7e WatchSource:0}: Error finding container c5ef6e8d1d4a6d62ff4404a12342a10388c12843ae2dfb94d6e97efd15519a7e: Status 404 returned error can't find the 
container with id c5ef6e8d1d4a6d62ff4404a12342a10388c12843ae2dfb94d6e97efd15519a7e Apr 21 17:33:38.549018 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:38.548991 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f956cbc_c15f_455e_8caf_a1b6e26e74ca.slice/crio-f0933a5729e820238947702f43f9e2639c096e4ed06181326be4996fe9bb0534 WatchSource:0}: Error finding container f0933a5729e820238947702f43f9e2639c096e4ed06181326be4996fe9bb0534: Status 404 returned error can't find the container with id f0933a5729e820238947702f43f9e2639c096e4ed06181326be4996fe9bb0534 Apr 21 17:33:38.550079 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:38.550051 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a95a69b_a0e2_4e0f_a650_b9b615190bfe.slice/crio-81713460ac0792541efc866000224cf1c8676713ee74fbbbf95b7ada03d26839 WatchSource:0}: Error finding container 81713460ac0792541efc866000224cf1c8676713ee74fbbbf95b7ada03d26839: Status 404 returned error can't find the container with id 81713460ac0792541efc866000224cf1c8676713ee74fbbbf95b7ada03d26839 Apr 21 17:33:38.550824 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:38.550788 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd32873_afbe_4cda_ad81_5c8d17abaeb9.slice/crio-01364201adc38d14c20b0a0f234b6ecc349b5032ac8453a574eecba7220f961d WatchSource:0}: Error finding container 01364201adc38d14c20b0a0f234b6ecc349b5032ac8453a574eecba7220f961d: Status 404 returned error can't find the container with id 01364201adc38d14c20b0a0f234b6ecc349b5032ac8453a574eecba7220f961d Apr 21 17:33:38.552919 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:38.552889 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba4ce9e_e0ea_4092_a0eb_1f8a684daf98.slice/crio-5d48f4b5a57476a365a42ef824faf4a104a5d89605cc5d0b80b119af519e4f28 WatchSource:0}: Error finding container 5d48f4b5a57476a365a42ef824faf4a104a5d89605cc5d0b80b119af519e4f28: Status 404 returned error can't find the container with id 5d48f4b5a57476a365a42ef824faf4a104a5d89605cc5d0b80b119af519e4f28 Apr 21 17:33:38.553191 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:38.553121 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4691871_0174_4f20_9333_c53e5c44774f.slice/crio-d1639ae1fb4e334c735b2c7d3c4c379c7213ee3c74ceb58faedd99a24e288b64 WatchSource:0}: Error finding container d1639ae1fb4e334c735b2c7d3c4c379c7213ee3c74ceb58faedd99a24e288b64: Status 404 returned error can't find the container with id d1639ae1fb4e334c735b2c7d3c4c379c7213ee3c74ceb58faedd99a24e288b64 Apr 21 17:33:38.554901 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:33:38.554877 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f71749_bd4e_478c_9148_e08f6598c072.slice/crio-cf10ec3820cbf3107a563f71143e589b3ec2e23696e29e35f3c5462972ae92ec WatchSource:0}: Error finding container cf10ec3820cbf3107a563f71143e589b3ec2e23696e29e35f3c5462972ae92ec: Status 404 returned error can't find the container with id cf10ec3820cbf3107a563f71143e589b3ec2e23696e29e35f3c5462972ae92ec Apr 21 17:33:38.904961 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.904854 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:38.905446 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:38.904972 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:38.913176 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.913126 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmg6z" event={"ID":"97f71749-bd4e-478c-9148-e08f6598c072","Type":"ContainerStarted","Data":"cf10ec3820cbf3107a563f71143e589b3ec2e23696e29e35f3c5462972ae92ec"} Apr 21 17:33:38.916702 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.916670 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerStarted","Data":"f0933a5729e820238947702f43f9e2639c096e4ed06181326be4996fe9bb0534"} Apr 21 17:33:38.917951 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.917921 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6fqcj" event={"ID":"755e2315-552e-4e00-ba7b-cf1e07a5c8d1","Type":"ContainerStarted","Data":"c5ef6e8d1d4a6d62ff4404a12342a10388c12843ae2dfb94d6e97efd15519a7e"} Apr 21 17:33:38.919109 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.919079 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ct269" event={"ID":"065ac800-604a-42e9-b0cc-e56598f56081","Type":"ContainerStarted","Data":"0ed07e5a14d43d6eb93e96492c08b3ac7ea56f05850be96a1e6f616bec63b302"} Apr 21 17:33:38.920211 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.920166 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-np76c" event={"ID":"d4691871-0174-4f20-9333-c53e5c44774f","Type":"ContainerStarted","Data":"d1639ae1fb4e334c735b2c7d3c4c379c7213ee3c74ceb58faedd99a24e288b64"} Apr 21 17:33:38.923826 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.923785 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zvdf9" event={"ID":"5dd32873-afbe-4cda-ad81-5c8d17abaeb9","Type":"ContainerStarted","Data":"01364201adc38d14c20b0a0f234b6ecc349b5032ac8453a574eecba7220f961d"} Apr 21 17:33:38.925481 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.925438 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" event={"ID":"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98","Type":"ContainerStarted","Data":"5d48f4b5a57476a365a42ef824faf4a104a5d89605cc5d0b80b119af519e4f28"} Apr 21 17:33:38.928996 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.928959 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" event={"ID":"6a95a69b-a0e2-4e0f-a650-b9b615190bfe","Type":"ContainerStarted","Data":"81713460ac0792541efc866000224cf1c8676713ee74fbbbf95b7ada03d26839"} Apr 21 17:33:38.930964 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.930936 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" event={"ID":"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0","Type":"ContainerStarted","Data":"245c7d27467fc278692e5d28b021eb31f9d65ff9cb53355ccdc4bc29bd8c6dcf"} Apr 21 17:33:38.932626 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:38.932601 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" event={"ID":"454bf7e88903cb3fed5cc9e7d8cf5d0d","Type":"ContainerStarted","Data":"2d763d85426815308eef69a0c8c52fb48cbd3993b3bc85e7fd9db33485b341d1"} Apr 21 
17:33:39.079309 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:39.079256 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 17:28:37 +0000 UTC" deadline="2028-02-06 12:43:34.724135692 +0000 UTC" Apr 21 17:33:39.079309 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:39.079307 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15739h9m55.644834566s" Apr 21 17:33:39.502477 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:39.502442 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:39.502667 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:39.502524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gvp\" (UniqueName: \"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:39.502667 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:39.502662 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 17:33:39.502778 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:39.502680 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 17:33:39.502778 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:39.502695 2573 projected.go:194] Error preparing data for 
projected volume kube-api-access-z2gvp for pod openshift-network-diagnostics/network-check-target-8f9r2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:39.502778 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:39.502757 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp podName:b9e16605-c885-47fb-ba9d-4b218cc44030 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:41.502737676 +0000 UTC m=+6.213431782 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-z2gvp" (UniqueName: "kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp") pod "network-check-target-8f9r2" (UID: "b9e16605-c885-47fb-ba9d-4b218cc44030") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:39.502933 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:39.502841 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:39.502933 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:39.502893 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs podName:dcd8a3eb-e25c-4dcb-9468-d578a60a826c nodeName:}" failed. No retries permitted until 2026-04-21 17:33:41.502864885 +0000 UTC m=+6.213558977 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs") pod "network-metrics-daemon-wtk7c" (UID: "dcd8a3eb-e25c-4dcb-9468-d578a60a826c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:39.906329 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:39.906295 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:39.906809 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:39.906440 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:39.956592 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:39.956215 2573 generic.go:358] "Generic (PLEG): container finished" podID="816568b9527d9455f848c001abfac64a" containerID="22ec694f549ad0d33d3586dffc4902f5b23e473ba65f8af7a1a36a3b1b7dbf8a" exitCode=0 Apr 21 17:33:39.956592 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:39.956377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" event={"ID":"816568b9527d9455f848c001abfac64a","Type":"ContainerDied","Data":"22ec694f549ad0d33d3586dffc4902f5b23e473ba65f8af7a1a36a3b1b7dbf8a"} Apr 21 17:33:40.005982 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:40.005918 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-77.ec2.internal" podStartSLOduration=3.005850357 podStartE2EDuration="3.005850357s" podCreationTimestamp="2026-04-21 17:33:37 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:33:39.012785264 +0000 UTC m=+3.723479373" watchObservedRunningTime="2026-04-21 17:33:40.005850357 +0000 UTC m=+4.716544467" Apr 21 17:33:40.905467 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:40.905409 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:40.905672 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:40.905553 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:40.970507 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:40.970466 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" event={"ID":"816568b9527d9455f848c001abfac64a","Type":"ContainerStarted","Data":"1d982d2f4eaafa731e70d55c4dd78388e964d5fec158fcc00c42db1a76b40238"} Apr 21 17:33:41.523040 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:41.522998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:41.523298 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:41.523097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gvp\" (UniqueName: 
\"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:41.523298 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:41.523269 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 17:33:41.523298 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:41.523289 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 17:33:41.523482 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:41.523303 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z2gvp for pod openshift-network-diagnostics/network-check-target-8f9r2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:41.523482 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:41.523364 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp podName:b9e16605-c885-47fb-ba9d-4b218cc44030 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:45.52334425 +0000 UTC m=+10.234038350 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z2gvp" (UniqueName: "kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp") pod "network-check-target-8f9r2" (UID: "b9e16605-c885-47fb-ba9d-4b218cc44030") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:41.523789 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:41.523773 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:41.523880 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:41.523822 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs podName:dcd8a3eb-e25c-4dcb-9468-d578a60a826c nodeName:}" failed. No retries permitted until 2026-04-21 17:33:45.523805774 +0000 UTC m=+10.234499865 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs") pod "network-metrics-daemon-wtk7c" (UID: "dcd8a3eb-e25c-4dcb-9468-d578a60a826c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:41.907848 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:41.907756 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:41.907998 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:41.907897 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:42.905058 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:42.905019 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:42.905569 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:42.905169 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:43.905375 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:43.905156 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:43.905375 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:43.905314 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:44.905026 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:44.904983 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:44.905223 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:44.905146 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:45.557582 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:45.557532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gvp\" (UniqueName: \"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:45.558016 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:45.557601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:45.558016 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:45.557731 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 17:33:45.558016 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:45.557762 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 17:33:45.558016 ip-10-0-134-77 kubenswrapper[2573]: E0421 
17:33:45.557775 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:45.558016 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:45.557840 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs podName:dcd8a3eb-e25c-4dcb-9468-d578a60a826c nodeName:}" failed. No retries permitted until 2026-04-21 17:33:53.557827008 +0000 UTC m=+18.268521093 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs") pod "network-metrics-daemon-wtk7c" (UID: "dcd8a3eb-e25c-4dcb-9468-d578a60a826c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:45.558016 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:45.557778 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z2gvp for pod openshift-network-diagnostics/network-check-target-8f9r2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:45.558016 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:45.557909 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp podName:b9e16605-c885-47fb-ba9d-4b218cc44030 nodeName:}" failed. No retries permitted until 2026-04-21 17:33:53.557891959 +0000 UTC m=+18.268586048 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z2gvp" (UniqueName: "kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp") pod "network-check-target-8f9r2" (UID: "b9e16605-c885-47fb-ba9d-4b218cc44030") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:45.906732 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:45.906622 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:45.906899 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:45.906823 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:46.905149 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:46.905097 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:46.905616 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:46.905283 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:47.905585 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:47.905545 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:47.906054 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:47.905691 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:48.905070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:48.905030 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:48.905284 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:48.905178 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:49.905208 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:49.905165 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:49.905680 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:49.905323 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:50.905221 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:50.905182 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:50.905707 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:50.905303 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:51.905070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:51.905029 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:51.905383 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:51.905183 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:52.905369 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:52.905335 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:52.905558 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:52.905444 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:53.619180 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:53.619126 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gvp\" (UniqueName: \"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:53.619388 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:53.619200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:53.619388 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:53.619300 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:53.619388 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:53.619323 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 17:33:53.619388 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:53.619354 2573 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 17:33:53.619388 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:53.619368 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z2gvp for pod openshift-network-diagnostics/network-check-target-8f9r2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:53.619699 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:53.619368 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs podName:dcd8a3eb-e25c-4dcb-9468-d578a60a826c nodeName:}" failed. No retries permitted until 2026-04-21 17:34:09.619349197 +0000 UTC m=+34.330043285 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs") pod "network-metrics-daemon-wtk7c" (UID: "dcd8a3eb-e25c-4dcb-9468-d578a60a826c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:33:53.619699 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:53.619427 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp podName:b9e16605-c885-47fb-ba9d-4b218cc44030 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:09.619409991 +0000 UTC m=+34.330104083 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z2gvp" (UniqueName: "kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp") pod "network-check-target-8f9r2" (UID: "b9e16605-c885-47fb-ba9d-4b218cc44030") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:33:53.904799 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:53.904717 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:53.905020 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:53.904858 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:54.905386 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:54.905345 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:54.905912 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:54.905469 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:55.907618 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:55.907587 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:55.907997 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:55.907717 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:56.003351 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:56.003086 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-ct269" event={"ID":"065ac800-604a-42e9-b0cc-e56598f56081","Type":"ContainerStarted","Data":"967f07c4b67c518f95695e70a150af04d77c48040f82e8be7607e31d6b67819d"} Apr 21 17:33:56.904727 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:56.904639 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:56.904888 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:56.904755 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:57.006508 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.006474 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zvdf9" event={"ID":"5dd32873-afbe-4cda-ad81-5c8d17abaeb9","Type":"ContainerStarted","Data":"827d6b750d26880dab60ebf029104bf42b2b88fb0a411e6ae8e58fc707d47fd4"} Apr 21 17:33:57.007758 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.007735 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" event={"ID":"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98","Type":"ContainerStarted","Data":"8a18e0cc87ba9d3192bdd97c5ebd993ab6158492fb4700da0d1dd5304c851fbd"} Apr 21 17:33:57.008973 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.008949 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" event={"ID":"6a95a69b-a0e2-4e0f-a650-b9b615190bfe","Type":"ContainerStarted","Data":"db43139b88f01cad715bb43e36741fea35811e5a624c40809e487a46113afc3a"} Apr 21 17:33:57.015501 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.015468 2573 generic.go:358] "Generic (PLEG): container finished" podID="d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0" containerID="4fa86c0f5ff427080b011b320211ef306f0501d405f36d2007e1a4e86240499e" exitCode=0 Apr 21 17:33:57.015631 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.015550 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" event={"ID":"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0","Type":"ContainerDied","Data":"4fa86c0f5ff427080b011b320211ef306f0501d405f36d2007e1a4e86240499e"} Apr 21 17:33:57.016858 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.016829 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmg6z" 
event={"ID":"97f71749-bd4e-478c-9148-e08f6598c072","Type":"ContainerStarted","Data":"a14192f479986f05f769c9bb2a20904fa6bdc312a35ec4664c2f699d46f39a2d"} Apr 21 17:33:57.019206 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.019189 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:33:57.019475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.019454 2573 generic.go:358] "Generic (PLEG): container finished" podID="3f956cbc-c15f-455e-8caf-a1b6e26e74ca" containerID="a95541502cbcfd41b6301e5071e8e41b645f6bf5657487aa88fef0d360966508" exitCode=1 Apr 21 17:33:57.019531 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.019513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerStarted","Data":"ea8978b5b4c4bd2638d26357d0a3492dbf542d91320296377f5611779e502d4c"} Apr 21 17:33:57.019609 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.019535 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerStarted","Data":"37e7d01f231f769e9e316c16ab32779b218caf6ebd41e6e58ea5f5916b200a0b"} Apr 21 17:33:57.019609 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.019544 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerStarted","Data":"0e5b8b35c9354cd1e435f24e9a2e7dab2c2aeac344228ebd9eb37bdcd54ebd9b"} Apr 21 17:33:57.019609 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.019553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" 
event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerStarted","Data":"16b1525cdc127607c25466ddfd9d664bc1ba4fd3e0f5e68d6b7dd33a64f6c1ad"} Apr 21 17:33:57.019609 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.019567 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerDied","Data":"a95541502cbcfd41b6301e5071e8e41b645f6bf5657487aa88fef0d360966508"} Apr 21 17:33:57.019609 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.019581 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerStarted","Data":"f23c7d1f6909734e2695cb7ba5aea3e698b4582e06cdcd225f3f8023e8814eae"} Apr 21 17:33:57.020626 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.020605 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6fqcj" event={"ID":"755e2315-552e-4e00-ba7b-cf1e07a5c8d1","Type":"ContainerStarted","Data":"debb665238c5dcfa2b2842b1ee4a640b38ce0c52d4697a7f844ffe16179cf66e"} Apr 21 17:33:57.022374 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.022340 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-77.ec2.internal" podStartSLOduration=20.02232951 podStartE2EDuration="20.02232951s" podCreationTimestamp="2026-04-21 17:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:33:41.007021649 +0000 UTC m=+5.717715759" watchObservedRunningTime="2026-04-21 17:33:57.02232951 +0000 UTC m=+21.733023643" Apr 21 17:33:57.022546 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.022528 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zvdf9" 
podStartSLOduration=4.150657524 podStartE2EDuration="21.022523525s" podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:33:38.553067156 +0000 UTC m=+3.263761250" lastFinishedPulling="2026-04-21 17:33:55.424933165 +0000 UTC m=+20.135627251" observedRunningTime="2026-04-21 17:33:57.022147938 +0000 UTC m=+21.732842040" watchObservedRunningTime="2026-04-21 17:33:57.022523525 +0000 UTC m=+21.733217633" Apr 21 17:33:57.040830 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.040775 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8rkhb" podStartSLOduration=4.865544907 podStartE2EDuration="22.040757948s" podCreationTimestamp="2026-04-21 17:33:35 +0000 UTC" firstStartedPulling="2026-04-21 17:33:38.552945184 +0000 UTC m=+3.263639277" lastFinishedPulling="2026-04-21 17:33:55.728158231 +0000 UTC m=+20.438852318" observedRunningTime="2026-04-21 17:33:57.040421394 +0000 UTC m=+21.751115530" watchObservedRunningTime="2026-04-21 17:33:57.040757948 +0000 UTC m=+21.751452056" Apr 21 17:33:57.057059 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.057012 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-ct269" podStartSLOduration=4.178394847 podStartE2EDuration="21.057000656s" podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:33:38.546325862 +0000 UTC m=+3.257019948" lastFinishedPulling="2026-04-21 17:33:55.424931672 +0000 UTC m=+20.135625757" observedRunningTime="2026-04-21 17:33:57.056505491 +0000 UTC m=+21.767199599" watchObservedRunningTime="2026-04-21 17:33:57.057000656 +0000 UTC m=+21.767694763" Apr 21 17:33:57.073065 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.073021 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6fqcj" podStartSLOduration=4.197594472 podStartE2EDuration="21.073006328s" 
podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:33:38.549522371 +0000 UTC m=+3.260216458" lastFinishedPulling="2026-04-21 17:33:55.424934214 +0000 UTC m=+20.135628314" observedRunningTime="2026-04-21 17:33:57.072674816 +0000 UTC m=+21.783368936" watchObservedRunningTime="2026-04-21 17:33:57.073006328 +0000 UTC m=+21.783700437" Apr 21 17:33:57.132178 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.132101 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vmg6z" podStartSLOduration=3.926299468 podStartE2EDuration="21.132085822s" podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:33:38.558531208 +0000 UTC m=+3.269225295" lastFinishedPulling="2026-04-21 17:33:55.764317559 +0000 UTC m=+20.475011649" observedRunningTime="2026-04-21 17:33:57.131776878 +0000 UTC m=+21.842470987" watchObservedRunningTime="2026-04-21 17:33:57.132085822 +0000 UTC m=+21.842779929" Apr 21 17:33:57.341709 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.341681 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 17:33:57.841387 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.841277 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T17:33:57.341701517Z","UUID":"bd278755-8f68-456e-8751-6192e2f95487","Handler":null,"Name":"","Endpoint":""} Apr 21 17:33:57.843349 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.843324 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 17:33:57.843477 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.843359 2573 csi_plugin.go:119] kubernetes.io/csi: Register new 
plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 17:33:57.904887 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:57.904853 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:57.905069 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:57.905017 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:33:58.024190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:58.024146 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" event={"ID":"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98","Type":"ContainerStarted","Data":"c84df8ee45306827b8f7ff5f77c5b629a2bd6c13f8fbb3bcfc225aecb158ee7c"} Apr 21 17:33:58.025828 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:58.025646 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-np76c" event={"ID":"d4691871-0174-4f20-9333-c53e5c44774f","Type":"ContainerStarted","Data":"7d2e1e20dc50e076f8be73083351e32e94ed566b9e41675851229aa74cc62d59"} Apr 21 17:33:58.904645 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:58.904611 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:33:58.904832 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:58.904712 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:33:59.030027 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:59.029927 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" event={"ID":"fba4ce9e-e0ea-4092-a0eb-1f8a684daf98","Type":"ContainerStarted","Data":"4988015d0a027082bb4aa07c3a8a76bb3d39acc3780f5d869a9ad78674be38fb"} Apr 21 17:33:59.033293 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:59.033268 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:33:59.033695 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:59.033665 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerStarted","Data":"4def2e8cb2832e9f4af020dbd91f3e59130353613972c968ac2b3db5f827aed6"} Apr 21 17:33:59.055840 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:59.055780 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-7kvmh" podStartSLOduration=4.112548729 podStartE2EDuration="24.055762331s" podCreationTimestamp="2026-04-21 17:33:35 +0000 UTC" firstStartedPulling="2026-04-21 17:33:38.556947454 +0000 UTC m=+3.267641546" lastFinishedPulling="2026-04-21 17:33:58.50016106 +0000 UTC 
m=+23.210855148" observedRunningTime="2026-04-21 17:33:59.055470725 +0000 UTC m=+23.766164834" watchObservedRunningTime="2026-04-21 17:33:59.055762331 +0000 UTC m=+23.766456439" Apr 21 17:33:59.056246 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:59.056213 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-np76c" podStartSLOduration=5.884463237 podStartE2EDuration="23.056205356s" podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:33:38.556417435 +0000 UTC m=+3.267111527" lastFinishedPulling="2026-04-21 17:33:55.728159554 +0000 UTC m=+20.438853646" observedRunningTime="2026-04-21 17:33:58.050905753 +0000 UTC m=+22.761599864" watchObservedRunningTime="2026-04-21 17:33:59.056205356 +0000 UTC m=+23.766899465" Apr 21 17:33:59.560497 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:59.560272 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-ct269" Apr 21 17:33:59.561007 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:59.560987 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-ct269" Apr 21 17:33:59.905030 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:33:59.904958 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:33:59.905221 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:33:59.905067 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:34:00.035538 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:00.035467 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-ct269" Apr 21 17:34:00.035948 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:00.035913 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-ct269" Apr 21 17:34:00.905303 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:00.905266 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:00.905441 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:00.905401 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:34:01.041745 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:01.041039 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:34:01.041745 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:01.041477 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerStarted","Data":"b871063aa95897fc38117d26e0fd57d6b4833e845fc67592ae374e0378eee5de"} Apr 21 17:34:01.043241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:01.041934 2573 scope.go:117] "RemoveContainer" containerID="a95541502cbcfd41b6301e5071e8e41b645f6bf5657487aa88fef0d360966508" Apr 21 17:34:01.043241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:01.042058 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:34:01.043241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:01.042081 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:34:01.043241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:01.042090 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:34:01.062641 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:01.062560 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:34:01.063123 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:01.062900 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:34:01.905828 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:34:01.905626 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:34:01.906009 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:01.905938 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:34:02.905094 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:02.905040 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:02.905646 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:02.905207 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:34:02.946874 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:02.946844 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wtk7c"] Apr 21 17:34:02.947023 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:02.946952 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:34:02.947059 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:02.947038 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:34:02.949123 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:02.949093 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8f9r2"] Apr 21 17:34:03.048122 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:03.048043 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:34:03.048460 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:03.048427 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" event={"ID":"3f956cbc-c15f-455e-8caf-a1b6e26e74ca","Type":"ContainerStarted","Data":"a988a85d403fb6141ba107789a8301fc8526e454d757c1dc9ad4427a8773f23d"} Apr 21 17:34:03.050215 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:03.050191 2573 generic.go:358] "Generic (PLEG): container finished" podID="d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0" containerID="044c9723eb901c8c9ebb22aaf25ab9e69a048096a0f051f82dce0371072a045d" exitCode=0 Apr 21 17:34:03.050305 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:03.050246 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" event={"ID":"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0","Type":"ContainerDied","Data":"044c9723eb901c8c9ebb22aaf25ab9e69a048096a0f051f82dce0371072a045d"} Apr 21 17:34:03.050305 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:34:03.050267 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:03.050451 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:03.050435 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:34:03.082967 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:03.082920 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" podStartSLOduration=9.836190391 podStartE2EDuration="27.082906052s" podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:33:38.551704684 +0000 UTC m=+3.262398785" lastFinishedPulling="2026-04-21 17:33:55.798420346 +0000 UTC m=+20.509114446" observedRunningTime="2026-04-21 17:34:03.082789802 +0000 UTC m=+27.793483920" watchObservedRunningTime="2026-04-21 17:34:03.082906052 +0000 UTC m=+27.793600137" Apr 21 17:34:04.054038 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:04.054003 2573 generic.go:358] "Generic (PLEG): container finished" podID="d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0" containerID="ac29bd2642855386ff6b4249bfd51ee9668b725e35ae68b671bf720570aa1a45" exitCode=0 Apr 21 17:34:04.054502 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:04.054092 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" event={"ID":"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0","Type":"ContainerDied","Data":"ac29bd2642855386ff6b4249bfd51ee9668b725e35ae68b671bf720570aa1a45"} Apr 21 17:34:04.905462 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:34:04.905380 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:04.905600 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:04.905492 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:34:04.905600 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:04.905541 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:34:04.905665 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:04.905636 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:34:05.058681 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:05.058649 2573 generic.go:358] "Generic (PLEG): container finished" podID="d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0" containerID="d06b20e6c9f8928eb891bb3409c5ff822f6770a9e26cfb0822108c730a07f972" exitCode=0 Apr 21 17:34:05.059092 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:05.058716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" event={"ID":"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0","Type":"ContainerDied","Data":"d06b20e6c9f8928eb891bb3409c5ff822f6770a9e26cfb0822108c730a07f972"} Apr 21 17:34:06.904879 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:06.904840 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:06.905369 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:06.904840 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:34:06.905369 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:06.904979 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:34:06.905369 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:06.905088 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:34:08.905182 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:08.905070 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:08.905182 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:08.905077 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:34:08.905752 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:08.905221 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8f9r2" podUID="b9e16605-c885-47fb-ba9d-4b218cc44030" Apr 21 17:34:08.905752 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:08.905264 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:34:09.160570 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.160494 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-77.ec2.internal" event="NodeReady" Apr 21 17:34:09.160711 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.160651 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 17:34:09.213382 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.213321 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4vw75"] Apr 21 17:34:09.216152 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.216103 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.217787 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.217756 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dzd2m"] Apr 21 17:34:09.219584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.219536 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mlvk8\"" Apr 21 17:34:09.219727 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.219539 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 17:34:09.219782 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.219739 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:09.219884 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.219825 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 17:34:09.223094 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.223064 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4fbh6\"" Apr 21 17:34:09.223094 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.223075 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 17:34:09.223374 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.223354 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 17:34:09.223595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.223558 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 17:34:09.233153 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.233108 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4vw75"] Apr 21 17:34:09.235963 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.235918 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dzd2m"] Apr 21 17:34:09.343321 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.343280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/480585ac-7f87-43b8-98a2-398a61a10ad3-tmp-dir\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.343520 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.343355 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/480585ac-7f87-43b8-98a2-398a61a10ad3-config-volume\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.343520 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.343441 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.343520 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.343492 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4grp\" (UniqueName: \"kubernetes.io/projected/480585ac-7f87-43b8-98a2-398a61a10ad3-kube-api-access-v4grp\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.343661 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.343558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r48kw\" (UniqueName: \"kubernetes.io/projected/85cdcbba-06ac-4178-a25d-5d6eb221a155-kube-api-access-r48kw\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:09.343661 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.343603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 
17:34:09.444657 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.444564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.444657 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.444623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4grp\" (UniqueName: \"kubernetes.io/projected/480585ac-7f87-43b8-98a2-398a61a10ad3-kube-api-access-v4grp\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.444657 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.444653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r48kw\" (UniqueName: \"kubernetes.io/projected/85cdcbba-06ac-4178-a25d-5d6eb221a155-kube-api-access-r48kw\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:09.444940 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.444693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:09.444940 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.444725 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/480585ac-7f87-43b8-98a2-398a61a10ad3-tmp-dir\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.444940 ip-10-0-134-77 
kubenswrapper[2573]: E0421 17:34:09.444728 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:34:09.444940 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.444759 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/480585ac-7f87-43b8-98a2-398a61a10ad3-config-volume\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.444940 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.444793 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls podName:480585ac-7f87-43b8-98a2-398a61a10ad3 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:09.944773851 +0000 UTC m=+34.655467949 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls") pod "dns-default-4vw75" (UID: "480585ac-7f87-43b8-98a2-398a61a10ad3") : secret "dns-default-metrics-tls" not found Apr 21 17:34:09.445235 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.444987 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:09.445235 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.445035 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert podName:85cdcbba-06ac-4178-a25d-5d6eb221a155 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:09.945020702 +0000 UTC m=+34.655714803 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert") pod "ingress-canary-dzd2m" (UID: "85cdcbba-06ac-4178-a25d-5d6eb221a155") : secret "canary-serving-cert" not found Apr 21 17:34:09.445369 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.445345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/480585ac-7f87-43b8-98a2-398a61a10ad3-tmp-dir\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.445983 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.445956 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/480585ac-7f87-43b8-98a2-398a61a10ad3-config-volume\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.457560 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.457527 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4grp\" (UniqueName: \"kubernetes.io/projected/480585ac-7f87-43b8-98a2-398a61a10ad3-kube-api-access-v4grp\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.457728 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.457594 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r48kw\" (UniqueName: \"kubernetes.io/projected/85cdcbba-06ac-4178-a25d-5d6eb221a155-kube-api-access-r48kw\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:09.646422 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.646383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gvp\" (UniqueName: 
\"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:09.646612 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.646446 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:34:09.646612 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.646557 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 17:34:09.646612 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.646577 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:34:09.646734 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.646631 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs podName:dcd8a3eb-e25c-4dcb-9468-d578a60a826c nodeName:}" failed. No retries permitted until 2026-04-21 17:34:41.646617909 +0000 UTC m=+66.357311995 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs") pod "network-metrics-daemon-wtk7c" (UID: "dcd8a3eb-e25c-4dcb-9468-d578a60a826c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 17:34:09.646734 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.646580 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 17:34:09.646734 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.646676 2573 projected.go:194] Error preparing data for projected volume kube-api-access-z2gvp for pod openshift-network-diagnostics/network-check-target-8f9r2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:34:09.646734 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.646718 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp podName:b9e16605-c885-47fb-ba9d-4b218cc44030 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:41.64670665 +0000 UTC m=+66.357400736 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-z2gvp" (UniqueName: "kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp") pod "network-check-target-8f9r2" (UID: "b9e16605-c885-47fb-ba9d-4b218cc44030") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 17:34:09.948689 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.948651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:09.949240 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:09.948724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:09.949240 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.948814 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:09.949240 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.948816 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:34:09.949240 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.948898 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls podName:480585ac-7f87-43b8-98a2-398a61a10ad3 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:10.948869378 +0000 UTC m=+35.659563475 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls") pod "dns-default-4vw75" (UID: "480585ac-7f87-43b8-98a2-398a61a10ad3") : secret "dns-default-metrics-tls" not found Apr 21 17:34:09.949240 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:09.948918 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert podName:85cdcbba-06ac-4178-a25d-5d6eb221a155 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:10.948908863 +0000 UTC m=+35.659602952 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert") pod "ingress-canary-dzd2m" (UID: "85cdcbba-06ac-4178-a25d-5d6eb221a155") : secret "canary-serving-cert" not found Apr 21 17:34:10.904768 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:10.904730 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:34:10.904768 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:10.904756 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:10.909368 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:10.909341 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 17:34:10.909539 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:10.909341 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 17:34:10.909539 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:10.909349 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqc26\"" Apr 21 17:34:10.909539 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:10.909355 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 17:34:10.909539 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:10.909352 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5lbmh\"" Apr 21 17:34:10.957274 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:10.957247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:10.957812 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:10.957312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:10.957812 ip-10-0-134-77 
kubenswrapper[2573]: E0421 17:34:10.957403 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:10.957812 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:10.957414 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:34:10.957812 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:10.957480 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert podName:85cdcbba-06ac-4178-a25d-5d6eb221a155 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:12.957455732 +0000 UTC m=+37.668149838 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert") pod "ingress-canary-dzd2m" (UID: "85cdcbba-06ac-4178-a25d-5d6eb221a155") : secret "canary-serving-cert" not found Apr 21 17:34:10.957812 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:10.957503 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls podName:480585ac-7f87-43b8-98a2-398a61a10ad3 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:12.957493656 +0000 UTC m=+37.668187746 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls") pod "dns-default-4vw75" (UID: "480585ac-7f87-43b8-98a2-398a61a10ad3") : secret "dns-default-metrics-tls" not found Apr 21 17:34:11.074203 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:11.074171 2573 generic.go:358] "Generic (PLEG): container finished" podID="d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0" containerID="1477a49b5f2bed404a5dfdecba25a45d60925b336a0baf75cf302eca7683f083" exitCode=0 Apr 21 17:34:11.074203 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:11.074209 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" event={"ID":"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0","Type":"ContainerDied","Data":"1477a49b5f2bed404a5dfdecba25a45d60925b336a0baf75cf302eca7683f083"} Apr 21 17:34:12.078722 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:12.078680 2573 generic.go:358] "Generic (PLEG): container finished" podID="d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0" containerID="aec3571003722471ebc07dfd2f068ca9b1e7bd9679ea1d7e787f80b8412de987" exitCode=0 Apr 21 17:34:12.079209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:12.078748 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" event={"ID":"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0","Type":"ContainerDied","Data":"aec3571003722471ebc07dfd2f068ca9b1e7bd9679ea1d7e787f80b8412de987"} Apr 21 17:34:12.973930 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:12.973687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:12.974090 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:12.973838 2573 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:34:12.974090 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:12.973996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:12.974090 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:12.974055 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls podName:480585ac-7f87-43b8-98a2-398a61a10ad3 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:16.974034236 +0000 UTC m=+41.684728324 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls") pod "dns-default-4vw75" (UID: "480585ac-7f87-43b8-98a2-398a61a10ad3") : secret "dns-default-metrics-tls" not found Apr 21 17:34:12.974235 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:12.974093 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:12.974235 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:12.974146 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert podName:85cdcbba-06ac-4178-a25d-5d6eb221a155 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:16.974118264 +0000 UTC m=+41.684812365 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert") pod "ingress-canary-dzd2m" (UID: "85cdcbba-06ac-4178-a25d-5d6eb221a155") : secret "canary-serving-cert" not found Apr 21 17:34:13.083509 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:13.083472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" event={"ID":"d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0","Type":"ContainerStarted","Data":"906ec080422a67414cc6ca786add2768c5d4fd857ea84cf7bfe38b65523ca8d5"} Apr 21 17:34:13.108578 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:13.108512 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zcgfw" podStartSLOduration=4.951393616 podStartE2EDuration="37.108495124s" podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:33:38.545333621 +0000 UTC m=+3.256027712" lastFinishedPulling="2026-04-21 17:34:10.702435106 +0000 UTC m=+35.413129220" observedRunningTime="2026-04-21 17:34:13.106815029 +0000 UTC m=+37.817509138" watchObservedRunningTime="2026-04-21 17:34:13.108495124 +0000 UTC m=+37.819189268" Apr 21 17:34:17.004121 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:17.004059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:17.004547 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:17.004181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " 
pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:17.004547 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:17.004236 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:17.004547 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:17.004271 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:34:17.004547 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:17.004309 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert podName:85cdcbba-06ac-4178-a25d-5d6eb221a155 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:25.004289942 +0000 UTC m=+49.714984043 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert") pod "ingress-canary-dzd2m" (UID: "85cdcbba-06ac-4178-a25d-5d6eb221a155") : secret "canary-serving-cert" not found Apr 21 17:34:17.004547 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:17.004324 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls podName:480585ac-7f87-43b8-98a2-398a61a10ad3 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:25.00431859 +0000 UTC m=+49.715012679 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls") pod "dns-default-4vw75" (UID: "480585ac-7f87-43b8-98a2-398a61a10ad3") : secret "dns-default-metrics-tls" not found Apr 21 17:34:25.061870 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:25.061819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:25.062408 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:25.061886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:25.062408 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:25.061973 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:34:25.062408 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:25.062024 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:25.062408 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:25.062057 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls podName:480585ac-7f87-43b8-98a2-398a61a10ad3 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:41.062042105 +0000 UTC m=+65.772736191 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls") pod "dns-default-4vw75" (UID: "480585ac-7f87-43b8-98a2-398a61a10ad3") : secret "dns-default-metrics-tls" not found Apr 21 17:34:25.062408 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:25.062098 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert podName:85cdcbba-06ac-4178-a25d-5d6eb221a155 nodeName:}" failed. No retries permitted until 2026-04-21 17:34:41.062080518 +0000 UTC m=+65.772774610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert") pod "ingress-canary-dzd2m" (UID: "85cdcbba-06ac-4178-a25d-5d6eb221a155") : secret "canary-serving-cert" not found Apr 21 17:34:34.065587 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:34.065557 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-74gml" Apr 21 17:34:41.080398 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.080354 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:34:41.080888 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.080431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:34:41.080888 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:41.080527 2573 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:34:41.080888 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:41.080530 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:34:41.080888 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:41.080581 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert podName:85cdcbba-06ac-4178-a25d-5d6eb221a155 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:13.080567265 +0000 UTC m=+97.791261351 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert") pod "ingress-canary-dzd2m" (UID: "85cdcbba-06ac-4178-a25d-5d6eb221a155") : secret "canary-serving-cert" not found Apr 21 17:34:41.080888 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:41.080593 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls podName:480585ac-7f87-43b8-98a2-398a61a10ad3 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:13.08058787 +0000 UTC m=+97.791281956 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls") pod "dns-default-4vw75" (UID: "480585ac-7f87-43b8-98a2-398a61a10ad3") : secret "dns-default-metrics-tls" not found Apr 21 17:34:41.685640 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.685598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gvp\" (UniqueName: \"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:41.685845 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.685656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:34:41.688677 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.688645 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 17:34:41.688799 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.688691 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 17:34:41.696011 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:41.695971 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 17:34:41.696155 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:34:41.696071 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs podName:dcd8a3eb-e25c-4dcb-9468-d578a60a826c 
nodeName:}" failed. No retries permitted until 2026-04-21 17:35:45.696050027 +0000 UTC m=+130.406744114 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs") pod "network-metrics-daemon-wtk7c" (UID: "dcd8a3eb-e25c-4dcb-9468-d578a60a826c") : secret "metrics-daemon-secret" not found Apr 21 17:34:41.698380 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.698356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 17:34:41.710069 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.710037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2gvp\" (UniqueName: \"kubernetes.io/projected/b9e16605-c885-47fb-ba9d-4b218cc44030-kube-api-access-z2gvp\") pod \"network-check-target-8f9r2\" (UID: \"b9e16605-c885-47fb-ba9d-4b218cc44030\") " pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:41.823515 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.823481 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5lbmh\"" Apr 21 17:34:41.831553 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.831528 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:34:41.978519 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:41.978442 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8f9r2"] Apr 21 17:34:41.982260 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:34:41.982233 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9e16605_c885_47fb_ba9d_4b218cc44030.slice/crio-2cc803d4dd95533bacb4726fc38f199d813f676db9171e73fac8d8e207d76ddb WatchSource:0}: Error finding container 2cc803d4dd95533bacb4726fc38f199d813f676db9171e73fac8d8e207d76ddb: Status 404 returned error can't find the container with id 2cc803d4dd95533bacb4726fc38f199d813f676db9171e73fac8d8e207d76ddb Apr 21 17:34:42.141367 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:42.141327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8f9r2" event={"ID":"b9e16605-c885-47fb-ba9d-4b218cc44030","Type":"ContainerStarted","Data":"2cc803d4dd95533bacb4726fc38f199d813f676db9171e73fac8d8e207d76ddb"} Apr 21 17:34:45.148363 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:45.148270 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8f9r2" event={"ID":"b9e16605-c885-47fb-ba9d-4b218cc44030","Type":"ContainerStarted","Data":"8811f374ac3c94b9fe0143df8e07a456d5956b07c9489b23246869c49dff1e1a"} Apr 21 17:34:45.148762 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:34:45.148400 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:35:13.106912 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:13.106849 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:35:13.107372 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:13.107003 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 17:35:13.107372 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:13.107029 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:35:13.107372 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:13.107073 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert podName:85cdcbba-06ac-4178-a25d-5d6eb221a155 nodeName:}" failed. No retries permitted until 2026-04-21 17:36:17.107055877 +0000 UTC m=+161.817749967 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert") pod "ingress-canary-dzd2m" (UID: "85cdcbba-06ac-4178-a25d-5d6eb221a155") : secret "canary-serving-cert" not found Apr 21 17:35:13.107372 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:13.107153 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 17:35:13.107372 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:13.107219 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls podName:480585ac-7f87-43b8-98a2-398a61a10ad3 nodeName:}" failed. 
No retries permitted until 2026-04-21 17:36:17.107202978 +0000 UTC m=+161.817897068 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls") pod "dns-default-4vw75" (UID: "480585ac-7f87-43b8-98a2-398a61a10ad3") : secret "dns-default-metrics-tls" not found Apr 21 17:35:16.153072 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:16.153038 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-8f9r2" Apr 21 17:35:16.169089 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:16.169040 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-8f9r2" podStartSLOduration=97.300741721 podStartE2EDuration="1m40.169025411s" podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:34:41.983977696 +0000 UTC m=+66.694671781" lastFinishedPulling="2026-04-21 17:34:44.852261384 +0000 UTC m=+69.562955471" observedRunningTime="2026-04-21 17:34:45.180446288 +0000 UTC m=+69.891140396" watchObservedRunningTime="2026-04-21 17:35:16.169025411 +0000 UTC m=+100.879719518" Apr 21 17:35:40.114358 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.114321 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jb47l"] Apr 21 17:35:40.116245 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.116227 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.119021 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.118989 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 21 17:35:40.119207 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.119181 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 21 17:35:40.119301 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.119285 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-rp2d8\"" Apr 21 17:35:40.120530 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.120510 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 17:35:40.120628 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.120541 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 17:35:40.126441 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.126364 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 21 17:35:40.128103 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.128082 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jb47l"] Apr 21 17:35:40.211418 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.211378 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m"] Apr 21 17:35:40.213273 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.213257 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.217405 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.217375 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 17:35:40.217405 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.217391 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 17:35:40.217595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.217421 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 17:35:40.217595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.217457 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 17:35:40.217595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.217478 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5gdkk\"" Apr 21 17:35:40.223330 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.223301 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m"] Apr 21 17:35:40.293490 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.293454 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.293663 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.293519 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a784f755-6ef2-4edf-993b-25f3e45d1082-tmp\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.293663 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.293542 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a784f755-6ef2-4edf-993b-25f3e45d1082-serving-cert\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.293663 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.293561 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8xx\" (UniqueName: \"kubernetes.io/projected/433814e5-34ad-4fb0-bc66-58a825bc82e4-kube-api-access-dc8xx\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.293663 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.293597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a784f755-6ef2-4edf-993b-25f3e45d1082-service-ca-bundle\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.293663 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.293621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cg5v\" (UniqueName: 
\"kubernetes.io/projected/a784f755-6ef2-4edf-993b-25f3e45d1082-kube-api-access-6cg5v\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.293663 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.293637 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/433814e5-34ad-4fb0-bc66-58a825bc82e4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.293847 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.293666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a784f755-6ef2-4edf-993b-25f3e45d1082-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.293847 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.293730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a784f755-6ef2-4edf-993b-25f3e45d1082-snapshots\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.394952 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.394868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a784f755-6ef2-4edf-993b-25f3e45d1082-snapshots\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " 
pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.394952 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.394912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.395185 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.394957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a784f755-6ef2-4edf-993b-25f3e45d1082-tmp\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.395185 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.394985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a784f755-6ef2-4edf-993b-25f3e45d1082-serving-cert\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.395185 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.395008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8xx\" (UniqueName: \"kubernetes.io/projected/433814e5-34ad-4fb0-bc66-58a825bc82e4-kube-api-access-dc8xx\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.395185 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:40.395103 2573 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 17:35:40.395390 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.395197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a784f755-6ef2-4edf-993b-25f3e45d1082-service-ca-bundle\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.395390 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:40.395205 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls podName:433814e5-34ad-4fb0-bc66-58a825bc82e4 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:40.895184511 +0000 UTC m=+125.605878598 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-45p2m" (UID: "433814e5-34ad-4fb0-bc66-58a825bc82e4") : secret "cluster-monitoring-operator-tls" not found Apr 21 17:35:40.395390 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.395244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cg5v\" (UniqueName: \"kubernetes.io/projected/a784f755-6ef2-4edf-993b-25f3e45d1082-kube-api-access-6cg5v\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.395390 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.395276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: 
\"kubernetes.io/configmap/433814e5-34ad-4fb0-bc66-58a825bc82e4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.395390 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.395328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a784f755-6ef2-4edf-993b-25f3e45d1082-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.395655 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.395555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a784f755-6ef2-4edf-993b-25f3e45d1082-tmp\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.395655 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.395609 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/a784f755-6ef2-4edf-993b-25f3e45d1082-snapshots\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.395899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.395877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a784f755-6ef2-4edf-993b-25f3e45d1082-service-ca-bundle\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.396024 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:35:40.396004 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/433814e5-34ad-4fb0-bc66-58a825bc82e4-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.396299 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.396280 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a784f755-6ef2-4edf-993b-25f3e45d1082-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.397471 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.397454 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a784f755-6ef2-4edf-993b-25f3e45d1082-serving-cert\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.403962 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.403936 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cg5v\" (UniqueName: \"kubernetes.io/projected/a784f755-6ef2-4edf-993b-25f3e45d1082-kube-api-access-6cg5v\") pod \"insights-operator-585dfdc468-jb47l\" (UID: \"a784f755-6ef2-4edf-993b-25f3e45d1082\") " pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.404057 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.404028 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8xx\" (UniqueName: \"kubernetes.io/projected/433814e5-34ad-4fb0-bc66-58a825bc82e4-kube-api-access-dc8xx\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: 
\"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.425904 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.425869 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-jb47l" Apr 21 17:35:40.546880 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.546845 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-jb47l"] Apr 21 17:35:40.550361 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:35:40.550330 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda784f755_6ef2_4edf_993b_25f3e45d1082.slice/crio-44bbe3dedd4a220b31ae58a7bbfdf149d6490e29b9baf22d8404a3b6e6b96801 WatchSource:0}: Error finding container 44bbe3dedd4a220b31ae58a7bbfdf149d6490e29b9baf22d8404a3b6e6b96801: Status 404 returned error can't find the container with id 44bbe3dedd4a220b31ae58a7bbfdf149d6490e29b9baf22d8404a3b6e6b96801 Apr 21 17:35:40.898852 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:40.898816 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:40.899036 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:40.898977 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 17:35:40.899078 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:40.899046 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls podName:433814e5-34ad-4fb0-bc66-58a825bc82e4 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:41.899028511 +0000 UTC m=+126.609722617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-45p2m" (UID: "433814e5-34ad-4fb0-bc66-58a825bc82e4") : secret "cluster-monitoring-operator-tls" not found Apr 21 17:35:41.256635 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:41.256596 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jb47l" event={"ID":"a784f755-6ef2-4edf-993b-25f3e45d1082","Type":"ContainerStarted","Data":"44bbe3dedd4a220b31ae58a7bbfdf149d6490e29b9baf22d8404a3b6e6b96801"} Apr 21 17:35:41.906012 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:41.905972 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:41.906210 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:41.906108 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 17:35:41.906210 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:41.906184 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls podName:433814e5-34ad-4fb0-bc66-58a825bc82e4 nodeName:}" failed. 
No retries permitted until 2026-04-21 17:35:43.906167775 +0000 UTC m=+128.616861861 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-45p2m" (UID: "433814e5-34ad-4fb0-bc66-58a825bc82e4") : secret "cluster-monitoring-operator-tls" not found Apr 21 17:35:43.016014 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.015973 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6"] Apr 21 17:35:43.017937 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.017916 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6" Apr 21 17:35:43.020508 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.020486 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:35:43.021831 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.021805 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 17:35:43.021955 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.021844 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-64l7t\"" Apr 21 17:35:43.025890 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.025864 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6"] Apr 21 17:35:43.117352 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.117314 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-qskcs\" (UniqueName: \"kubernetes.io/projected/a4e0f79c-332d-452e-95ac-b42786fe90d2-kube-api-access-qskcs\") pod \"volume-data-source-validator-7c6cbb6c87-lhmd6\" (UID: \"a4e0f79c-332d-452e-95ac-b42786fe90d2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6" Apr 21 17:35:43.218240 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.218198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qskcs\" (UniqueName: \"kubernetes.io/projected/a4e0f79c-332d-452e-95ac-b42786fe90d2-kube-api-access-qskcs\") pod \"volume-data-source-validator-7c6cbb6c87-lhmd6\" (UID: \"a4e0f79c-332d-452e-95ac-b42786fe90d2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6" Apr 21 17:35:43.226955 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.226929 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qskcs\" (UniqueName: \"kubernetes.io/projected/a4e0f79c-332d-452e-95ac-b42786fe90d2-kube-api-access-qskcs\") pod \"volume-data-source-validator-7c6cbb6c87-lhmd6\" (UID: \"a4e0f79c-332d-452e-95ac-b42786fe90d2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6" Apr 21 17:35:43.261631 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.261598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jb47l" event={"ID":"a784f755-6ef2-4edf-993b-25f3e45d1082","Type":"ContainerStarted","Data":"b5909a37956215864c18516a35c491ae40e30b6aa64dc0331851216cd91b62b9"} Apr 21 17:35:43.278123 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.277990 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-jb47l" podStartSLOduration=1.326965617 podStartE2EDuration="3.277971336s" podCreationTimestamp="2026-04-21 17:35:40 +0000 UTC" 
firstStartedPulling="2026-04-21 17:35:40.552169332 +0000 UTC m=+125.262863418" lastFinishedPulling="2026-04-21 17:35:42.50317505 +0000 UTC m=+127.213869137" observedRunningTime="2026-04-21 17:35:43.277455415 +0000 UTC m=+127.988149525" watchObservedRunningTime="2026-04-21 17:35:43.277971336 +0000 UTC m=+127.988665443" Apr 21 17:35:43.327639 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.327603 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6" Apr 21 17:35:43.454085 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.454045 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6"] Apr 21 17:35:43.457754 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:35:43.457716 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4e0f79c_332d_452e_95ac_b42786fe90d2.slice/crio-282cfc5e7e5827615c4afd77a871d0ead776bf21f150e8264408be7575c8f727 WatchSource:0}: Error finding container 282cfc5e7e5827615c4afd77a871d0ead776bf21f150e8264408be7575c8f727: Status 404 returned error can't find the container with id 282cfc5e7e5827615c4afd77a871d0ead776bf21f150e8264408be7575c8f727 Apr 21 17:35:43.924204 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:43.924152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:43.924384 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:43.924303 2573 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 17:35:43.924384 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:43.924380 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls podName:433814e5-34ad-4fb0-bc66-58a825bc82e4 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:47.924364167 +0000 UTC m=+132.635058254 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-45p2m" (UID: "433814e5-34ad-4fb0-bc66-58a825bc82e4") : secret "cluster-monitoring-operator-tls" not found Apr 21 17:35:44.265202 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:44.265161 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6" event={"ID":"a4e0f79c-332d-452e-95ac-b42786fe90d2","Type":"ContainerStarted","Data":"282cfc5e7e5827615c4afd77a871d0ead776bf21f150e8264408be7575c8f727"} Apr 21 17:35:45.035536 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.035504 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pz6fc"] Apr 21 17:35:45.037186 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.037164 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.039890 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.039864 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:35:45.040029 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.039872 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-wf4j4\"" Apr 21 17:35:45.040029 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.039871 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 17:35:45.040029 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.039933 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 17:35:45.041370 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.041321 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 17:35:45.046164 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.046122 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 17:35:45.049565 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.049542 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pz6fc"] Apr 21 17:35:45.133423 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.133379 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a1fd903-a226-46d4-8e61-54eac7ea70b3-serving-cert\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.133607 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.133458 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a1fd903-a226-46d4-8e61-54eac7ea70b3-config\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.133607 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.133479 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a1fd903-a226-46d4-8e61-54eac7ea70b3-trusted-ca\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.133607 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.133549 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4jvf\" (UniqueName: \"kubernetes.io/projected/8a1fd903-a226-46d4-8e61-54eac7ea70b3-kube-api-access-s4jvf\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.234076 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.234027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a1fd903-a226-46d4-8e61-54eac7ea70b3-serving-cert\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.234322 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.234119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8a1fd903-a226-46d4-8e61-54eac7ea70b3-config\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.234322 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.234218 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a1fd903-a226-46d4-8e61-54eac7ea70b3-trusted-ca\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.234322 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.234271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4jvf\" (UniqueName: \"kubernetes.io/projected/8a1fd903-a226-46d4-8e61-54eac7ea70b3-kube-api-access-s4jvf\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.234790 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.234770 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a1fd903-a226-46d4-8e61-54eac7ea70b3-config\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.234921 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.234902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a1fd903-a226-46d4-8e61-54eac7ea70b3-trusted-ca\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.236480 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:35:45.236458 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a1fd903-a226-46d4-8e61-54eac7ea70b3-serving-cert\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.243459 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.243431 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4jvf\" (UniqueName: \"kubernetes.io/projected/8a1fd903-a226-46d4-8e61-54eac7ea70b3-kube-api-access-s4jvf\") pod \"console-operator-9d4b6777b-pz6fc\" (UID: \"8a1fd903-a226-46d4-8e61-54eac7ea70b3\") " pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.268565 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.268526 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6" event={"ID":"a4e0f79c-332d-452e-95ac-b42786fe90d2","Type":"ContainerStarted","Data":"7ce2fe1d5329984cde4fbf1eae698c508de8a3207ee8446c8869c4adda45c882"} Apr 21 17:35:45.346504 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.346412 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:35:45.468003 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.467949 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lhmd6" podStartSLOduration=0.860695269 podStartE2EDuration="2.467930581s" podCreationTimestamp="2026-04-21 17:35:43 +0000 UTC" firstStartedPulling="2026-04-21 17:35:43.460183202 +0000 UTC m=+128.170877288" lastFinishedPulling="2026-04-21 17:35:45.06741851 +0000 UTC m=+129.778112600" observedRunningTime="2026-04-21 17:35:45.286797751 +0000 UTC m=+129.997491881" watchObservedRunningTime="2026-04-21 17:35:45.467930581 +0000 UTC m=+130.178624713" Apr 21 17:35:45.468441 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.468421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-pz6fc"] Apr 21 17:35:45.472561 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:35:45.472537 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a1fd903_a226_46d4_8e61_54eac7ea70b3.slice/crio-1ddfc3abdd440e96a0144a31a3bfb3fc05251c5964ba314084e9b3acff0758f3 WatchSource:0}: Error finding container 1ddfc3abdd440e96a0144a31a3bfb3fc05251c5964ba314084e9b3acff0758f3: Status 404 returned error can't find the container with id 1ddfc3abdd440e96a0144a31a3bfb3fc05251c5964ba314084e9b3acff0758f3 Apr 21 17:35:45.738789 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.738748 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:35:45.739040 ip-10-0-134-77 kubenswrapper[2573]: 
E0421 17:35:45.738874 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 17:35:45.739040 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:45.738929 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs podName:dcd8a3eb-e25c-4dcb-9468-d578a60a826c nodeName:}" failed. No retries permitted until 2026-04-21 17:37:47.738913636 +0000 UTC m=+252.449607722 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs") pod "network-metrics-daemon-wtk7c" (UID: "dcd8a3eb-e25c-4dcb-9468-d578a60a826c") : secret "metrics-daemon-secret" not found Apr 21 17:35:45.786251 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.786216 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zvdf9_5dd32873-afbe-4cda-ad81-5c8d17abaeb9/dns-node-resolver/0.log" Apr 21 17:35:45.841717 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.841689 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-cd4c6697c-b2gpw"] Apr 21 17:35:45.844302 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.844285 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:45.847197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.847169 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-n6n9k\"" Apr 21 17:35:45.847329 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.847195 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 17:35:45.847329 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.847200 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 17:35:45.847329 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.847288 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 17:35:45.853209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.853184 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 17:35:45.857302 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.857277 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-cd4c6697c-b2gpw"] Apr 21 17:35:45.940796 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.940756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-image-registry-private-configuration\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:45.940999 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.940815 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-ca-trust-extracted\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:45.940999 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.940838 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-installation-pull-secrets\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:45.940999 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.940887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxclg\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-kube-api-access-vxclg\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:45.940999 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.940978 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-certificates\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:45.941234 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.941008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-bound-sa-token\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:45.941234 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.941034 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-trusted-ca\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:45.941234 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:45.941094 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042322 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.042227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-image-registry-private-configuration\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042322 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.042282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-ca-trust-extracted\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " 
pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042550 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.042403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-installation-pull-secrets\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042550 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.042490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxclg\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-kube-api-access-vxclg\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042550 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.042530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-certificates\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042700 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.042562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-bound-sa-token\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042700 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.042596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-trusted-ca\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042700 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.042624 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-ca-trust-extracted\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042700 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.042633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.042885 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:46.042775 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 17:35:46.042885 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:46.042790 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd4c6697c-b2gpw: secret "image-registry-tls" not found Apr 21 17:35:46.042885 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:46.042864 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls podName:ec9e9129-fbcf-473a-b882-5c7e25ee6b50 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:46.542843029 +0000 UTC m=+131.253537129 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls") pod "image-registry-cd4c6697c-b2gpw" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50") : secret "image-registry-tls" not found Apr 21 17:35:46.043175 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.043156 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-certificates\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.043589 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.043572 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-trusted-ca\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.045333 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.045309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-image-registry-private-configuration\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.045420 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.045337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-installation-pull-secrets\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " 
pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.052069 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.052041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxclg\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-kube-api-access-vxclg\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.052204 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.052107 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-bound-sa-token\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.186303 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.186266 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6fqcj_755e2315-552e-4e00-ba7b-cf1e07a5c8d1/node-ca/0.log" Apr 21 17:35:46.272402 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.272365 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" event={"ID":"8a1fd903-a226-46d4-8e61-54eac7ea70b3","Type":"ContainerStarted","Data":"1ddfc3abdd440e96a0144a31a3bfb3fc05251c5964ba314084e9b3acff0758f3"} Apr 21 17:35:46.546493 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:46.546451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:46.546678 ip-10-0-134-77 kubenswrapper[2573]: E0421 
17:35:46.546571 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 17:35:46.546678 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:46.546583 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd4c6697c-b2gpw: secret "image-registry-tls" not found Apr 21 17:35:46.546678 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:46.546634 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls podName:ec9e9129-fbcf-473a-b882-5c7e25ee6b50 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:47.546620828 +0000 UTC m=+132.257314914 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls") pod "image-registry-cd4c6697c-b2gpw" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50") : secret "image-registry-tls" not found Apr 21 17:35:47.556194 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:47.555557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:35:47.556194 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:47.555707 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 17:35:47.556194 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:47.555722 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd4c6697c-b2gpw: secret "image-registry-tls" not found Apr 21 17:35:47.556194 ip-10-0-134-77 
kubenswrapper[2573]: E0421 17:35:47.555779 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls podName:ec9e9129-fbcf-473a-b882-5c7e25ee6b50 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:49.555760189 +0000 UTC m=+134.266454294 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls") pod "image-registry-cd4c6697c-b2gpw" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50") : secret "image-registry-tls" not found Apr 21 17:35:47.958325 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:47.958232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:35:47.958475 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:47.958338 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 17:35:47.958475 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:47.958421 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls podName:433814e5-34ad-4fb0-bc66-58a825bc82e4 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:55.958403906 +0000 UTC m=+140.669097991 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-45p2m" (UID: "433814e5-34ad-4fb0-bc66-58a825bc82e4") : secret "cluster-monitoring-operator-tls" not found
Apr 21 17:35:48.278715 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:48.278685 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/0.log"
Apr 21 17:35:48.278894 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:48.278727 2573 generic.go:358] "Generic (PLEG): container finished" podID="8a1fd903-a226-46d4-8e61-54eac7ea70b3" containerID="3f5fba35d3f0d9c6b3118a1229b5642bb8c41882cfc232befaf7a2cd5f33f1b5" exitCode=255
Apr 21 17:35:48.278894 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:48.278766 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" event={"ID":"8a1fd903-a226-46d4-8e61-54eac7ea70b3","Type":"ContainerDied","Data":"3f5fba35d3f0d9c6b3118a1229b5642bb8c41882cfc232befaf7a2cd5f33f1b5"}
Apr 21 17:35:48.279021 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:48.279005 2573 scope.go:117] "RemoveContainer" containerID="3f5fba35d3f0d9c6b3118a1229b5642bb8c41882cfc232befaf7a2cd5f33f1b5"
Apr 21 17:35:49.283146 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:49.283115 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/1.log"
Apr 21 17:35:49.283590 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:49.283524 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/0.log"
Apr 21 17:35:49.283590 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:49.283555 2573 generic.go:358] "Generic (PLEG): container finished" podID="8a1fd903-a226-46d4-8e61-54eac7ea70b3" containerID="77449c6931b74a37adcad8d287c3d967cd5bd91c35d5d6d71b8b4b590856ef88" exitCode=255
Apr 21 17:35:49.283659 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:49.283594 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" event={"ID":"8a1fd903-a226-46d4-8e61-54eac7ea70b3","Type":"ContainerDied","Data":"77449c6931b74a37adcad8d287c3d967cd5bd91c35d5d6d71b8b4b590856ef88"}
Apr 21 17:35:49.283659 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:49.283622 2573 scope.go:117] "RemoveContainer" containerID="3f5fba35d3f0d9c6b3118a1229b5642bb8c41882cfc232befaf7a2cd5f33f1b5"
Apr 21 17:35:49.283882 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:49.283857 2573 scope.go:117] "RemoveContainer" containerID="77449c6931b74a37adcad8d287c3d967cd5bd91c35d5d6d71b8b4b590856ef88"
Apr 21 17:35:49.284090 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:49.284071 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pz6fc_openshift-console-operator(8a1fd903-a226-46d4-8e61-54eac7ea70b3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" podUID="8a1fd903-a226-46d4-8e61-54eac7ea70b3"
Apr 21 17:35:49.570926 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:49.570842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw"
Apr 21 17:35:49.571077 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:49.570954 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 17:35:49.571077 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:49.570967 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd4c6697c-b2gpw: secret "image-registry-tls" not found
Apr 21 17:35:49.571077 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:49.571016 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls podName:ec9e9129-fbcf-473a-b882-5c7e25ee6b50 nodeName:}" failed. No retries permitted until 2026-04-21 17:35:53.571002972 +0000 UTC m=+138.281697059 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls") pod "image-registry-cd4c6697c-b2gpw" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50") : secret "image-registry-tls" not found
Apr 21 17:35:50.286446 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:50.286412 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/1.log"
Apr 21 17:35:50.286839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:50.286816 2573 scope.go:117] "RemoveContainer" containerID="77449c6931b74a37adcad8d287c3d967cd5bd91c35d5d6d71b8b4b590856ef88"
Apr 21 17:35:50.287018 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:50.287000 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pz6fc_openshift-console-operator(8a1fd903-a226-46d4-8e61-54eac7ea70b3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" podUID="8a1fd903-a226-46d4-8e61-54eac7ea70b3"
Apr 21 17:35:51.744054 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:51.744014 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b"]
Apr 21 17:35:51.746066 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:51.746044 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b"
Apr 21 17:35:51.748854 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:51.748832 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-gtfdt\""
Apr 21 17:35:51.754233 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:51.754211 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b"]
Apr 21 17:35:51.785759 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:51.785713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn22n\" (UniqueName: \"kubernetes.io/projected/fc19cc8c-7c6d-4680-948c-31fb8fb0f70e-kube-api-access-sn22n\") pod \"network-check-source-8894fc9bd-ndw4b\" (UID: \"fc19cc8c-7c6d-4680-948c-31fb8fb0f70e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b"
Apr 21 17:35:51.886251 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:51.886219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn22n\" (UniqueName: \"kubernetes.io/projected/fc19cc8c-7c6d-4680-948c-31fb8fb0f70e-kube-api-access-sn22n\") pod \"network-check-source-8894fc9bd-ndw4b\" (UID: \"fc19cc8c-7c6d-4680-948c-31fb8fb0f70e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b"
Apr 21 17:35:51.895560 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:51.895527 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn22n\" (UniqueName: \"kubernetes.io/projected/fc19cc8c-7c6d-4680-948c-31fb8fb0f70e-kube-api-access-sn22n\") pod \"network-check-source-8894fc9bd-ndw4b\" (UID: \"fc19cc8c-7c6d-4680-948c-31fb8fb0f70e\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b"
Apr 21 17:35:52.054593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.054489 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b"
Apr 21 17:35:52.172305 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.172248 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b"]
Apr 21 17:35:52.175458 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:35:52.175428 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc19cc8c_7c6d_4680_948c_31fb8fb0f70e.slice/crio-23ea328fd9600b9fe99d6448f55c5303a11331a7bc9fc00b8ccccc47d07f2c5d WatchSource:0}: Error finding container 23ea328fd9600b9fe99d6448f55c5303a11331a7bc9fc00b8ccccc47d07f2c5d: Status 404 returned error can't find the container with id 23ea328fd9600b9fe99d6448f55c5303a11331a7bc9fc00b8ccccc47d07f2c5d
Apr 21 17:35:52.292006 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.291969 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b" event={"ID":"fc19cc8c-7c6d-4680-948c-31fb8fb0f70e","Type":"ContainerStarted","Data":"90d43bc05d6fa532a2d74a18c84f46ca2b5ea50b20f91b3a7a7ee574788660c3"}
Apr 21 17:35:52.292006 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.292006 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b" event={"ID":"fc19cc8c-7c6d-4680-948c-31fb8fb0f70e","Type":"ContainerStarted","Data":"23ea328fd9600b9fe99d6448f55c5303a11331a7bc9fc00b8ccccc47d07f2c5d"}
Apr 21 17:35:52.308937 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.308827 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-ndw4b" podStartSLOduration=1.308808958 podStartE2EDuration="1.308808958s" podCreationTimestamp="2026-04-21 17:35:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:35:52.307477042 +0000 UTC m=+137.018171151" watchObservedRunningTime="2026-04-21 17:35:52.308808958 +0000 UTC m=+137.019503067"
Apr 21 17:35:52.632827 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.632746 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88"]
Apr 21 17:35:52.634785 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.634768 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88"
Apr 21 17:35:52.637942 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.637913 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 21 17:35:52.637942 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.637930 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-xprp7\""
Apr 21 17:35:52.638122 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.637950 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 21 17:35:52.646076 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.646048 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88"]
Apr 21 17:35:52.692276 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.692243 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4f6l\" (UniqueName: \"kubernetes.io/projected/3892d71e-83cc-4628-8e38-087a4c7c7787-kube-api-access-q4f6l\") pod \"migrator-74bb7799d9-flx88\" (UID: \"3892d71e-83cc-4628-8e38-087a4c7c7787\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88"
Apr 21 17:35:52.792824 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.792782 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4f6l\" (UniqueName: \"kubernetes.io/projected/3892d71e-83cc-4628-8e38-087a4c7c7787-kube-api-access-q4f6l\") pod \"migrator-74bb7799d9-flx88\" (UID: \"3892d71e-83cc-4628-8e38-087a4c7c7787\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88"
Apr 21 17:35:52.801045 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.801018 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4f6l\" (UniqueName: \"kubernetes.io/projected/3892d71e-83cc-4628-8e38-087a4c7c7787-kube-api-access-q4f6l\") pod \"migrator-74bb7799d9-flx88\" (UID: \"3892d71e-83cc-4628-8e38-087a4c7c7787\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88"
Apr 21 17:35:52.944749 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:52.944635 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88"
Apr 21 17:35:53.068826 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:53.068793 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88"]
Apr 21 17:35:53.072772 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:35:53.072743 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3892d71e_83cc_4628_8e38_087a4c7c7787.slice/crio-3afb243af3d58ddf450dc7fa5f3c9065657fa91517d0e92b460d3878f6e9cb26 WatchSource:0}: Error finding container 3afb243af3d58ddf450dc7fa5f3c9065657fa91517d0e92b460d3878f6e9cb26: Status 404 returned error can't find the container with id 3afb243af3d58ddf450dc7fa5f3c9065657fa91517d0e92b460d3878f6e9cb26
Apr 21 17:35:53.295895 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:53.295860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88" event={"ID":"3892d71e-83cc-4628-8e38-087a4c7c7787","Type":"ContainerStarted","Data":"3afb243af3d58ddf450dc7fa5f3c9065657fa91517d0e92b460d3878f6e9cb26"}
Apr 21 17:35:53.598277 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:53.598181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw"
Apr 21 17:35:53.598455 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:53.598344 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 17:35:53.598455 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:53.598369 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd4c6697c-b2gpw: secret "image-registry-tls" not found
Apr 21 17:35:53.598455 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:53.598440 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls podName:ec9e9129-fbcf-473a-b882-5c7e25ee6b50 nodeName:}" failed. No retries permitted until 2026-04-21 17:36:01.598419643 +0000 UTC m=+146.309113743 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls") pod "image-registry-cd4c6697c-b2gpw" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50") : secret "image-registry-tls" not found
Apr 21 17:35:55.303434 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:55.303395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88" event={"ID":"3892d71e-83cc-4628-8e38-087a4c7c7787","Type":"ContainerStarted","Data":"71fbd88d5f7632bb59a4a876ee138c96a8fb9e90dda46eb098e5c443d2f467e2"}
Apr 21 17:35:55.303434 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:55.303435 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88" event={"ID":"3892d71e-83cc-4628-8e38-087a4c7c7787","Type":"ContainerStarted","Data":"7c459ddb5f415dcefa29603c3a019afcd96daebddcbe7957c5ee0054206bedad"}
Apr 21 17:35:55.318720 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:55.318671 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-flx88" podStartSLOduration=1.9951231790000001 podStartE2EDuration="3.318655291s" podCreationTimestamp="2026-04-21 17:35:52 +0000 UTC" firstStartedPulling="2026-04-21 17:35:53.075098521 +0000 UTC m=+137.785792608" lastFinishedPulling="2026-04-21 17:35:54.398630631 +0000 UTC m=+139.109324720" observedRunningTime="2026-04-21 17:35:55.317746451 +0000 UTC m=+140.028440559" watchObservedRunningTime="2026-04-21 17:35:55.318655291 +0000 UTC m=+140.029349398"
Apr 21 17:35:55.347166 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:55.347116 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc"
Apr 21 17:35:55.347259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:55.347173 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc"
Apr 21 17:35:55.347534 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:55.347523 2573 scope.go:117] "RemoveContainer" containerID="77449c6931b74a37adcad8d287c3d967cd5bd91c35d5d6d71b8b4b590856ef88"
Apr 21 17:35:55.347696 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:55.347680 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-pz6fc_openshift-console-operator(8a1fd903-a226-46d4-8e61-54eac7ea70b3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" podUID="8a1fd903-a226-46d4-8e61-54eac7ea70b3"
Apr 21 17:35:56.015854 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:56.015812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m"
Apr 21 17:35:56.016042 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:56.015956 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 17:35:56.016042 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:56.016032 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls podName:433814e5-34ad-4fb0-bc66-58a825bc82e4 nodeName:}" failed. No retries permitted until 2026-04-21 17:36:12.016017409 +0000 UTC m=+156.726711500 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-45p2m" (UID: "433814e5-34ad-4fb0-bc66-58a825bc82e4") : secret "cluster-monitoring-operator-tls" not found
Apr 21 17:35:57.020500 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.020464 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nv7nt"]
Apr 21 17:35:57.022916 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.022898 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.025709 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.025684 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 17:35:57.025709 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.025699 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-55shw\""
Apr 21 17:35:57.027045 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.027029 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 17:35:57.035776 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.035745 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nv7nt"]
Apr 21 17:35:57.122969 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.122916 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.122969 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.122975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674h9\" (UniqueName: \"kubernetes.io/projected/ac21d64f-1975-43d4-bed8-a9c4ecce476f-kube-api-access-674h9\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.123210 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.123031 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ac21d64f-1975-43d4-bed8-a9c4ecce476f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.123210 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.123077 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ac21d64f-1975-43d4-bed8-a9c4ecce476f-crio-socket\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.123210 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.123181 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac21d64f-1975-43d4-bed8-a9c4ecce476f-data-volume\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.224039 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.224005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac21d64f-1975-43d4-bed8-a9c4ecce476f-data-volume\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.224290 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.224068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.224290 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.224119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-674h9\" (UniqueName: \"kubernetes.io/projected/ac21d64f-1975-43d4-bed8-a9c4ecce476f-kube-api-access-674h9\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.224290 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.224172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ac21d64f-1975-43d4-bed8-a9c4ecce476f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.224290 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:57.224184 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 21 17:35:57.224290 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.224209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ac21d64f-1975-43d4-bed8-a9c4ecce476f-crio-socket\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.224290 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:57.224260 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls podName:ac21d64f-1975-43d4-bed8-a9c4ecce476f nodeName:}" failed. No retries permitted until 2026-04-21 17:35:57.724239024 +0000 UTC m=+142.434933132 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-nv7nt" (UID: "ac21d64f-1975-43d4-bed8-a9c4ecce476f") : secret "insights-runtime-extractor-tls" not found
Apr 21 17:35:57.224290 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.224290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ac21d64f-1975-43d4-bed8-a9c4ecce476f-crio-socket\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.224565 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.224403 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac21d64f-1975-43d4-bed8-a9c4ecce476f-data-volume\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.224672 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.224651 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ac21d64f-1975-43d4-bed8-a9c4ecce476f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.232838 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.232815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-674h9\" (UniqueName: \"kubernetes.io/projected/ac21d64f-1975-43d4-bed8-a9c4ecce476f-kube-api-access-674h9\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.727816 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:57.727765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:57.728007 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:57.727915 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 21 17:35:57.728007 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:57.727992 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls podName:ac21d64f-1975-43d4-bed8-a9c4ecce476f nodeName:}" failed. No retries permitted until 2026-04-21 17:35:58.727976703 +0000 UTC m=+143.438670790 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-nv7nt" (UID: "ac21d64f-1975-43d4-bed8-a9c4ecce476f") : secret "insights-runtime-extractor-tls" not found
Apr 21 17:35:58.736232 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:35:58.736194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:35:58.736711 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:58.736366 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 21 17:35:58.736711 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:35:58.736449 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls podName:ac21d64f-1975-43d4-bed8-a9c4ecce476f nodeName:}" failed. No retries permitted until 2026-04-21 17:36:00.736427867 +0000 UTC m=+145.447121959 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-nv7nt" (UID: "ac21d64f-1975-43d4-bed8-a9c4ecce476f") : secret "insights-runtime-extractor-tls" not found
Apr 21 17:36:00.751809 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:00.751746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:36:00.752236 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:00.751905 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 21 17:36:00.752236 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:00.751983 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls podName:ac21d64f-1975-43d4-bed8-a9c4ecce476f nodeName:}" failed. No retries permitted until 2026-04-21 17:36:04.7519668 +0000 UTC m=+149.462660886 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls") pod "insights-runtime-extractor-nv7nt" (UID: "ac21d64f-1975-43d4-bed8-a9c4ecce476f") : secret "insights-runtime-extractor-tls" not found
Apr 21 17:36:01.659861 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:01.659823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw"
Apr 21 17:36:01.660027 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:01.659969 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 17:36:01.660027 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:01.659987 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-cd4c6697c-b2gpw: secret "image-registry-tls" not found
Apr 21 17:36:01.660118 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:01.660052 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls podName:ec9e9129-fbcf-473a-b882-5c7e25ee6b50 nodeName:}" failed. No retries permitted until 2026-04-21 17:36:17.660037023 +0000 UTC m=+162.370731109 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls") pod "image-registry-cd4c6697c-b2gpw" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50") : secret "image-registry-tls" not found
Apr 21 17:36:04.785510 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:04.785469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:36:04.787872 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:04.787847 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ac21d64f-1975-43d4-bed8-a9c4ecce476f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nv7nt\" (UID: \"ac21d64f-1975-43d4-bed8-a9c4ecce476f\") " pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:36:04.832061 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:04.832020 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nv7nt"
Apr 21 17:36:04.951428 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:04.951391 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nv7nt"]
Apr 21 17:36:05.327039 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:05.327009 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nv7nt" event={"ID":"ac21d64f-1975-43d4-bed8-a9c4ecce476f","Type":"ContainerStarted","Data":"cba0b3112d0fe9c32f4c76c9832c7663a9b65a0808e412e97f9fbd85afd0d5ed"}
Apr 21 17:36:05.327039 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:05.327044 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nv7nt" event={"ID":"ac21d64f-1975-43d4-bed8-a9c4ecce476f","Type":"ContainerStarted","Data":"1ccdb9b916d8287127e11feb6250d452313a21997f7874dfa7c89571c6cd59b2"}
Apr 21 17:36:06.331282 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:06.331237 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nv7nt" event={"ID":"ac21d64f-1975-43d4-bed8-a9c4ecce476f","Type":"ContainerStarted","Data":"314d65f834982a5a511630a2a008875a50bb375c04eccd0f7dc1a9f8814cd17c"}
Apr 21 17:36:07.334968 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:07.334878 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nv7nt" event={"ID":"ac21d64f-1975-43d4-bed8-a9c4ecce476f","Type":"ContainerStarted","Data":"ca0e36a43394376587cb07cb19e72b2a274e640713baa21bb74212ebffc4f92d"}
Apr 21 17:36:07.353865 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:07.353812 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nv7nt" podStartSLOduration=8.320189314 podStartE2EDuration="10.35379789s" podCreationTimestamp="2026-04-21 17:35:57 +0000 UTC" firstStartedPulling="2026-04-21 17:36:05.007910142 +0000 UTC m=+149.718604228" lastFinishedPulling="2026-04-21 17:36:07.041518715 +0000 UTC m=+151.752212804" observedRunningTime="2026-04-21 17:36:07.351935514 +0000 UTC m=+152.062629622" watchObservedRunningTime="2026-04-21 17:36:07.35379789 +0000 UTC m=+152.064491998"
Apr 21 17:36:07.905565 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:07.905527 2573 scope.go:117] "RemoveContainer" containerID="77449c6931b74a37adcad8d287c3d967cd5bd91c35d5d6d71b8b4b590856ef88"
Apr 21 17:36:08.338544 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:08.338516 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log"
Apr 21 17:36:08.339012 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:08.338914 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/1.log"
Apr 21 17:36:08.339012 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:08.338956 2573 generic.go:358] "Generic (PLEG): container finished" podID="8a1fd903-a226-46d4-8e61-54eac7ea70b3" containerID="2bc31f9bef602c063d5d3239901f030520530e7499af3e1262c0b7d4a6cdbb7d" exitCode=255
Apr 21 17:36:08.339112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:08.339027 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" event={"ID":"8a1fd903-a226-46d4-8e61-54eac7ea70b3","Type":"ContainerDied","Data":"2bc31f9bef602c063d5d3239901f030520530e7499af3e1262c0b7d4a6cdbb7d"}
Apr 21 17:36:08.339112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:08.339065 2573 scope.go:117] "RemoveContainer" containerID="77449c6931b74a37adcad8d287c3d967cd5bd91c35d5d6d71b8b4b590856ef88"
Apr 21 17:36:08.339608 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:08.339587 2573 scope.go:117] "RemoveContainer" containerID="2bc31f9bef602c063d5d3239901f030520530e7499af3e1262c0b7d4a6cdbb7d"
Apr 21 17:36:08.339806 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:08.339778 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-pz6fc_openshift-console-operator(8a1fd903-a226-46d4-8e61-54eac7ea70b3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" podUID="8a1fd903-a226-46d4-8e61-54eac7ea70b3"
Apr 21 17:36:09.342746 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:09.342718 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log"
Apr 21 17:36:12.045184 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:12.045116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m"
Apr 21 17:36:12.047543 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:12.047515 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/433814e5-34ad-4fb0-bc66-58a825bc82e4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-45p2m\" (UID: \"433814e5-34ad-4fb0-bc66-58a825bc82e4\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m"
Apr 21 17:36:12.230608 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:12.230560 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed
to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-4vw75" podUID="480585ac-7f87-43b8-98a2-398a61a10ad3" Apr 21 17:36:12.236749 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:12.236712 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dzd2m" podUID="85cdcbba-06ac-4178-a25d-5d6eb221a155" Apr 21 17:36:12.322421 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:12.322342 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" Apr 21 17:36:12.350397 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:12.350371 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4vw75" Apr 21 17:36:12.350561 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:12.350409 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:36:12.443518 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:12.443488 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m"] Apr 21 17:36:12.448416 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:12.448374 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod433814e5_34ad_4fb0_bc66_58a825bc82e4.slice/crio-d7af9b659d253df741407caf9ac86baa2d1e1d051e854579f50727a0589de75f WatchSource:0}: Error finding container d7af9b659d253df741407caf9ac86baa2d1e1d051e854579f50727a0589de75f: Status 404 returned error can't find the container with id d7af9b659d253df741407caf9ac86baa2d1e1d051e854579f50727a0589de75f Apr 21 17:36:13.354537 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:13.354503 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" event={"ID":"433814e5-34ad-4fb0-bc66-58a825bc82e4","Type":"ContainerStarted","Data":"d7af9b659d253df741407caf9ac86baa2d1e1d051e854579f50727a0589de75f"} Apr 21 17:36:13.915268 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:13.915226 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-wtk7c" podUID="dcd8a3eb-e25c-4dcb-9468-d578a60a826c" Apr 21 17:36:15.347032 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:15.346997 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:36:15.347032 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:15.347037 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:36:15.347493 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:15.347466 2573 scope.go:117] "RemoveContainer" containerID="2bc31f9bef602c063d5d3239901f030520530e7499af3e1262c0b7d4a6cdbb7d" Apr 21 17:36:15.347697 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:15.347677 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-pz6fc_openshift-console-operator(8a1fd903-a226-46d4-8e61-54eac7ea70b3)\"" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" podUID="8a1fd903-a226-46d4-8e61-54eac7ea70b3" Apr 21 17:36:15.361786 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:15.361751 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" event={"ID":"433814e5-34ad-4fb0-bc66-58a825bc82e4","Type":"ContainerStarted","Data":"5552f66b024531b32e27fb3193803e65cb75a1d4e9bc162003561d40b40e3d61"} Apr 21 17:36:15.392297 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:15.392243 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-45p2m" podStartSLOduration=33.279363789 podStartE2EDuration="35.392228174s" podCreationTimestamp="2026-04-21 17:35:40 +0000 UTC" firstStartedPulling="2026-04-21 17:36:12.450713352 +0000 UTC m=+157.161407438" lastFinishedPulling="2026-04-21 17:36:14.563577732 +0000 UTC m=+159.274271823" observedRunningTime="2026-04-21 17:36:15.390832006 +0000 UTC m=+160.101526127" watchObservedRunningTime="2026-04-21 17:36:15.392228174 +0000 UTC m=+160.102922318" Apr 21 17:36:17.185237 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.185174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:36:17.185712 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.185291 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:36:17.187725 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.187701 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/480585ac-7f87-43b8-98a2-398a61a10ad3-metrics-tls\") pod \"dns-default-4vw75\" (UID: \"480585ac-7f87-43b8-98a2-398a61a10ad3\") " pod="openshift-dns/dns-default-4vw75" Apr 21 17:36:17.187820 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.187801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85cdcbba-06ac-4178-a25d-5d6eb221a155-cert\") pod \"ingress-canary-dzd2m\" (UID: \"85cdcbba-06ac-4178-a25d-5d6eb221a155\") " pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:36:17.454636 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.454548 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4fbh6\"" Apr 21 17:36:17.455769 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.455750 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mlvk8\"" Apr 21 17:36:17.462051 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.462027 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dzd2m" Apr 21 17:36:17.462051 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.462045 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4vw75" Apr 21 17:36:17.605014 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.604971 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4vw75"] Apr 21 17:36:17.608525 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:17.608489 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480585ac_7f87_43b8_98a2_398a61a10ad3.slice/crio-b1006ca97cf10f038f8d23fc968d006a9b8c76f47dbf94da657109138b673b91 WatchSource:0}: Error finding container b1006ca97cf10f038f8d23fc968d006a9b8c76f47dbf94da657109138b673b91: Status 404 returned error can't find the container with id b1006ca97cf10f038f8d23fc968d006a9b8c76f47dbf94da657109138b673b91 Apr 21 17:36:17.624883 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.624847 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dzd2m"] Apr 21 17:36:17.628777 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:17.628744 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85cdcbba_06ac_4178_a25d_5d6eb221a155.slice/crio-b4ea6b8c374235c47900b3c90a478499d82538fb076324163062726e54242404 WatchSource:0}: Error finding container b4ea6b8c374235c47900b3c90a478499d82538fb076324163062726e54242404: Status 404 returned error can't find the container with id b4ea6b8c374235c47900b3c90a478499d82538fb076324163062726e54242404 Apr 21 17:36:17.688117 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.688079 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:36:17.690505 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.690483 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"image-registry-cd4c6697c-b2gpw\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:36:17.955775 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:17.955742 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:36:18.102680 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:18.102644 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-cd4c6697c-b2gpw"] Apr 21 17:36:18.108041 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:18.108003 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9e9129_fbcf_473a_b882_5c7e25ee6b50.slice/crio-81b2cc0522d5a9141e072507ae07a0e10f36494f81e9b6a669f5c0951b6da8ec WatchSource:0}: Error finding container 81b2cc0522d5a9141e072507ae07a0e10f36494f81e9b6a669f5c0951b6da8ec: Status 404 returned error can't find the container with id 81b2cc0522d5a9141e072507ae07a0e10f36494f81e9b6a669f5c0951b6da8ec Apr 21 17:36:18.371639 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:18.371563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" event={"ID":"ec9e9129-fbcf-473a-b882-5c7e25ee6b50","Type":"ContainerStarted","Data":"7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8"} Apr 21 17:36:18.371639 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:18.371606 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" event={"ID":"ec9e9129-fbcf-473a-b882-5c7e25ee6b50","Type":"ContainerStarted","Data":"81b2cc0522d5a9141e072507ae07a0e10f36494f81e9b6a669f5c0951b6da8ec"} Apr 21 17:36:18.371639 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:18.371647 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:36:18.373200 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:18.373166 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dzd2m" event={"ID":"85cdcbba-06ac-4178-a25d-5d6eb221a155","Type":"ContainerStarted","Data":"b4ea6b8c374235c47900b3c90a478499d82538fb076324163062726e54242404"} Apr 21 17:36:18.374462 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:18.374425 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4vw75" event={"ID":"480585ac-7f87-43b8-98a2-398a61a10ad3","Type":"ContainerStarted","Data":"b1006ca97cf10f038f8d23fc968d006a9b8c76f47dbf94da657109138b673b91"} Apr 21 17:36:18.391731 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:18.391667 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" podStartSLOduration=33.3916457 podStartE2EDuration="33.3916457s" podCreationTimestamp="2026-04-21 17:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:36:18.391261665 +0000 UTC m=+163.101955774" watchObservedRunningTime="2026-04-21 17:36:18.3916457 +0000 UTC m=+163.102339809" Apr 21 17:36:18.893095 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:18.893038 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-cd4c6697c-b2gpw"] Apr 21 17:36:20.381380 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:20.381275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dzd2m" event={"ID":"85cdcbba-06ac-4178-a25d-5d6eb221a155","Type":"ContainerStarted","Data":"26b3a14e72bcbfc468d5f9f19c698eb63139817ac7a5722db1879ea8d026a15f"} Apr 21 17:36:20.382921 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:20.382895 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4vw75" event={"ID":"480585ac-7f87-43b8-98a2-398a61a10ad3","Type":"ContainerStarted","Data":"4923749b46b8244e39680f3fdf0887a617d5af18c87b7a2bced829f7f4d1f0ce"} Apr 21 17:36:20.382921 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:20.382925 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4vw75" event={"ID":"480585ac-7f87-43b8-98a2-398a61a10ad3","Type":"ContainerStarted","Data":"94f79756d57b16cc3815586d5ae194ea364a09fbb31e7f75060d33b0e8e94eb9"} Apr 21 17:36:20.383077 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:20.382999 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-4vw75" Apr 21 17:36:20.397908 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:20.397835 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dzd2m" podStartSLOduration=129.543180076 podStartE2EDuration="2m11.397818658s" podCreationTimestamp="2026-04-21 17:34:09 +0000 UTC" firstStartedPulling="2026-04-21 17:36:17.630654053 +0000 UTC m=+162.341348139" lastFinishedPulling="2026-04-21 17:36:19.485292632 +0000 UTC m=+164.195986721" observedRunningTime="2026-04-21 17:36:20.397451415 +0000 UTC m=+165.108145537" watchObservedRunningTime="2026-04-21 17:36:20.397818658 +0000 UTC m=+165.108512770" Apr 21 17:36:20.416800 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:20.416750 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4vw75" podStartSLOduration=129.545683634 podStartE2EDuration="2m11.416732089s" podCreationTimestamp="2026-04-21 17:34:09 +0000 UTC" firstStartedPulling="2026-04-21 17:36:17.610349056 +0000 UTC m=+162.321043141" lastFinishedPulling="2026-04-21 17:36:19.48139751 +0000 UTC m=+164.192091596" observedRunningTime="2026-04-21 17:36:20.415933452 +0000 UTC m=+165.126627572" watchObservedRunningTime="2026-04-21 17:36:20.416732089 +0000 UTC m=+165.127426197" Apr 21 17:36:22.094714 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.094676 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-m84vb"] Apr 21 17:36:22.101234 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.101210 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.105369 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.105347 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-wk565\"" Apr 21 17:36:22.105536 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.105348 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 17:36:22.105536 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.105401 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 17:36:22.105536 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.105348 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 17:36:22.107759 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.107721 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-operator-5676c8c784-m84vb"] Apr 21 17:36:22.228550 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.228499 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.228727 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.228576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.228727 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.228610 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.228727 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.228649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbdvr\" (UniqueName: \"kubernetes.io/projected/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-kube-api-access-gbdvr\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.329351 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.329310 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.329520 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.329360 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.329520 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.329412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbdvr\" (UniqueName: \"kubernetes.io/projected/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-kube-api-access-gbdvr\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.329520 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.329456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.330196 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.330171 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.331908 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.331884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.332020 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.331977 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.338122 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.338095 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbdvr\" (UniqueName: \"kubernetes.io/projected/1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e-kube-api-access-gbdvr\") pod \"prometheus-operator-5676c8c784-m84vb\" (UID: \"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.411480 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.411387 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" Apr 21 17:36:22.542510 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:22.542477 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-m84vb"] Apr 21 17:36:22.546325 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:22.546286 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1046fd5a_af2d_4faa_84b3_25b9c6ca7b9e.slice/crio-5e4300e8d167b055c0e7f29fcc634c8c17f8800ed61567db4c153135c3a69271 WatchSource:0}: Error finding container 5e4300e8d167b055c0e7f29fcc634c8c17f8800ed61567db4c153135c3a69271: Status 404 returned error can't find the container with id 5e4300e8d167b055c0e7f29fcc634c8c17f8800ed61567db4c153135c3a69271 Apr 21 17:36:23.394213 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:23.394168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" event={"ID":"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e","Type":"ContainerStarted","Data":"5e4300e8d167b055c0e7f29fcc634c8c17f8800ed61567db4c153135c3a69271"} Apr 21 17:36:24.398742 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:24.398707 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" event={"ID":"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e","Type":"ContainerStarted","Data":"b27fa1c842a4339d647c18f2ff7baa44d70ded5f356bb9816b0c940b361cd31c"} Apr 21 17:36:24.398742 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:24.398746 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" event={"ID":"1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e","Type":"ContainerStarted","Data":"53ced95e41515565598dd4f0766d49e28997954e7282c20f91caf5c2bd2546bc"} Apr 21 17:36:24.416312 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:24.416256 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-m84vb" podStartSLOduration=1.023694318 podStartE2EDuration="2.416240985s" podCreationTimestamp="2026-04-21 17:36:22 +0000 UTC" firstStartedPulling="2026-04-21 17:36:22.548214462 +0000 UTC m=+167.258908562" lastFinishedPulling="2026-04-21 17:36:23.94076114 +0000 UTC m=+168.651455229" observedRunningTime="2026-04-21 17:36:24.415600505 +0000 UTC m=+169.126294648" watchObservedRunningTime="2026-04-21 17:36:24.416240985 +0000 UTC m=+169.126935284" Apr 21 17:36:24.905265 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:24.905218 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:36:26.450223 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.450193 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mn2tn"] Apr 21 17:36:26.453970 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.453890 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.456654 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.456630 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 17:36:26.456779 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.456752 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 17:36:26.456875 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.456848 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 17:36:26.457000 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.456929 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xm58w\""
Apr 21 17:36:26.471572 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.471544 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"]
Apr 21 17:36:26.474664 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.474646 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.477002 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.476979 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 21 17:36:26.477945 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.477922 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-c8rm6\""
Apr 21 17:36:26.478258 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.478241 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 21 17:36:26.478359 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.478342 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 21 17:36:26.488886 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.488858 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"]
Apr 21 17:36:26.564987 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.564947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb4fd4e-5662-4538-8827-31633f56c7ed-metrics-client-ca\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.564987 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.564993 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.565241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565019 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.565241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56pd6\" (UniqueName: \"kubernetes.io/projected/1cb4fd4e-5662-4538-8827-31633f56c7ed-kube-api-access-56pd6\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.565241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2725cd96-5201-42ad-93cd-e934fa8eb17e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.565241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.565241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565160 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2725cd96-5201-42ad-93cd-e934fa8eb17e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.565241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-tls\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.565241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565236 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-wtmp\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.565480 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk8kr\" (UniqueName: \"kubernetes.io/projected/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-api-access-hk8kr\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.565480 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565272 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.565480 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1cb4fd4e-5662-4538-8827-31633f56c7ed-sys\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.565480 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.565480 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1cb4fd4e-5662-4538-8827-31633f56c7ed-root\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.565480 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.565430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-textfile\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666381 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-wtmp\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666545 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk8kr\" (UniqueName: \"kubernetes.io/projected/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-api-access-hk8kr\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.666545 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.666545 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666461 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1cb4fd4e-5662-4538-8827-31633f56c7ed-sys\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666545 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.666545 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666525 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1cb4fd4e-5662-4538-8827-31633f56c7ed-root\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666545 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666539 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-wtmp\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666825 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-textfile\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666825 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1cb4fd4e-5662-4538-8827-31633f56c7ed-sys\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666825 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666621 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb4fd4e-5662-4538-8827-31633f56c7ed-metrics-client-ca\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666825 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666825 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1cb4fd4e-5662-4538-8827-31633f56c7ed-root\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666825 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666825 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666688 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56pd6\" (UniqueName: \"kubernetes.io/projected/1cb4fd4e-5662-4538-8827-31633f56c7ed-kube-api-access-56pd6\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.666825 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666711 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2725cd96-5201-42ad-93cd-e934fa8eb17e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.667266 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.667266 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666989 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-textfile\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.667266 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.666999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2725cd96-5201-42ad-93cd-e934fa8eb17e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.667266 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.667074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-tls\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.667266 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.667102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/2725cd96-5201-42ad-93cd-e934fa8eb17e-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.667511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.667310 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.667511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.667319 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-accelerators-collector-config\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.667511 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.667464 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1cb4fd4e-5662-4538-8827-31633f56c7ed-metrics-client-ca\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.667663 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.667611 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2725cd96-5201-42ad-93cd-e934fa8eb17e-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.669413 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.669383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.669563 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.669452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-tls\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.669629 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.669582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1cb4fd4e-5662-4538-8827-31633f56c7ed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.669629 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.669610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.677696 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.677664 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56pd6\" (UniqueName: \"kubernetes.io/projected/1cb4fd4e-5662-4538-8827-31633f56c7ed-kube-api-access-56pd6\") pod \"node-exporter-mn2tn\" (UID: \"1cb4fd4e-5662-4538-8827-31633f56c7ed\") " pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.684361 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.684331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk8kr\" (UniqueName: \"kubernetes.io/projected/2725cd96-5201-42ad-93cd-e934fa8eb17e-kube-api-access-hk8kr\") pod \"kube-state-metrics-69db897b98-vm4tx\" (UID: \"2725cd96-5201-42ad-93cd-e934fa8eb17e\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.763822 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.763783 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mn2tn"
Apr 21 17:36:26.775333 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:26.775300 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1cb4fd4e_5662_4538_8827_31633f56c7ed.slice/crio-c77c324d638fb99f70fbbad31e5f6f8e75c808c5669bb89089debf0923b94eb8 WatchSource:0}: Error finding container c77c324d638fb99f70fbbad31e5f6f8e75c808c5669bb89089debf0923b94eb8: Status 404 returned error can't find the container with id c77c324d638fb99f70fbbad31e5f6f8e75c808c5669bb89089debf0923b94eb8
Apr 21 17:36:26.782907 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.782881 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"
Apr 21 17:36:26.913324 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:26.913284 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-vm4tx"]
Apr 21 17:36:26.916893 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:26.916862 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2725cd96_5201_42ad_93cd_e934fa8eb17e.slice/crio-0213bd75dea64ec8656758dc79c13856f3c1b71a3d043d785ffdaf5efd42871b WatchSource:0}: Error finding container 0213bd75dea64ec8656758dc79c13856f3c1b71a3d043d785ffdaf5efd42871b: Status 404 returned error can't find the container with id 0213bd75dea64ec8656758dc79c13856f3c1b71a3d043d785ffdaf5efd42871b
Apr 21 17:36:27.408489 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:27.408451 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx" event={"ID":"2725cd96-5201-42ad-93cd-e934fa8eb17e","Type":"ContainerStarted","Data":"0213bd75dea64ec8656758dc79c13856f3c1b71a3d043d785ffdaf5efd42871b"}
Apr 21 17:36:27.409500 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:27.409476 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mn2tn" event={"ID":"1cb4fd4e-5662-4538-8827-31633f56c7ed","Type":"ContainerStarted","Data":"c77c324d638fb99f70fbbad31e5f6f8e75c808c5669bb89089debf0923b94eb8"}
Apr 21 17:36:29.417428 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.417393 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx" event={"ID":"2725cd96-5201-42ad-93cd-e934fa8eb17e","Type":"ContainerStarted","Data":"ff4ddca9561ecc604c35132c35af47c58d447615e882004d0b6a5c715ed1c8f6"}
Apr 21 17:36:29.417897 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.417438 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx" event={"ID":"2725cd96-5201-42ad-93cd-e934fa8eb17e","Type":"ContainerStarted","Data":"99da9a9e19adbd90c9067770f71e61943a0a121cea977e5d7c50fe45b0d7d715"}
Apr 21 17:36:29.417897 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.417459 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx" event={"ID":"2725cd96-5201-42ad-93cd-e934fa8eb17e","Type":"ContainerStarted","Data":"31fabc8f038c8b747bdb83335b6ae382f1578b6bf0ee7f7a74d81e537c29aff9"}
Apr 21 17:36:29.418825 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.418799 2573 generic.go:358] "Generic (PLEG): container finished" podID="1cb4fd4e-5662-4538-8827-31633f56c7ed" containerID="13b5b253e203a88d354bfe8f29534fe062fa17ea6c9a0032d8c1ec7fd3274726" exitCode=0
Apr 21 17:36:29.418887 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.418869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mn2tn" event={"ID":"1cb4fd4e-5662-4538-8827-31633f56c7ed","Type":"ContainerDied","Data":"13b5b253e203a88d354bfe8f29534fe062fa17ea6c9a0032d8c1ec7fd3274726"}
Apr 21 17:36:29.423255 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.423227 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"]
Apr 21 17:36:29.426912 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.426889 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.429921 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.429707 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 21 17:36:29.429921 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.429729 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 21 17:36:29.429921 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.429747 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-c4db475hlqka4\""
Apr 21 17:36:29.429921 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.429708 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 21 17:36:29.429921 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.429781 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 21 17:36:29.430829 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.430811 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-vcbtw\""
Apr 21 17:36:29.430938 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.430874 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 21 17:36:29.439532 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.439488 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"]
Apr 21 17:36:29.445201 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.445115 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-vm4tx" podStartSLOduration=1.9595529539999998 podStartE2EDuration="3.44509921s" podCreationTimestamp="2026-04-21 17:36:26 +0000 UTC" firstStartedPulling="2026-04-21 17:36:26.918761227 +0000 UTC m=+171.629455314" lastFinishedPulling="2026-04-21 17:36:28.404307483 +0000 UTC m=+173.115001570" observedRunningTime="2026-04-21 17:36:29.443626195 +0000 UTC m=+174.154320320" watchObservedRunningTime="2026-04-21 17:36:29.44509921 +0000 UTC m=+174.155793314"
Apr 21 17:36:29.491885 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.491855 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnfzs\" (UniqueName: \"kubernetes.io/projected/8166e55e-1dae-40ee-ae0f-4833d4cff10c-kube-api-access-hnfzs\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.492057 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.491997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.492163 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.492126 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.492233 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.492179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.492477 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.492346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-grpc-tls\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.492477 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.492403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.492622 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.492516 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8166e55e-1dae-40ee-ae0f-4833d4cff10c-metrics-client-ca\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.492622 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.492560 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-tls\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.593992 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.593953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.594208 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.594004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.594208 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.594022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.594208 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.594042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-grpc-tls\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.594208 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.594073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.594208 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.594098 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8166e55e-1dae-40ee-ae0f-4833d4cff10c-metrics-client-ca\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.594208 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.594124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-tls\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.594208 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.594183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnfzs\" (UniqueName: \"kubernetes.io/projected/8166e55e-1dae-40ee-ae0f-4833d4cff10c-kube-api-access-hnfzs\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.595000 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.594941 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8166e55e-1dae-40ee-ae0f-4833d4cff10c-metrics-client-ca\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.596787 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.596755 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-grpc-tls\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.596787 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.596777 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.597020 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.596990 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.597214 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.597196 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-tls\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.597513 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.597493 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.597772 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.597756 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8166e55e-1dae-40ee-ae0f-4833d4cff10c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.606499 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.606465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnfzs\" (UniqueName: \"kubernetes.io/projected/8166e55e-1dae-40ee-ae0f-4833d4cff10c-kube-api-access-hnfzs\") pod \"thanos-querier-569f6b9d8b-r5tpr\" (UID: \"8166e55e-1dae-40ee-ae0f-4833d4cff10c\") " pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"
Apr 21 17:36:29.737950 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.737910 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" Apr 21 17:36:29.875382 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.875334 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr"] Apr 21 17:36:29.879533 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:29.879504 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8166e55e_1dae_40ee_ae0f_4833d4cff10c.slice/crio-219056faab4aab4db545e7d84b2455966abffa6a16af1e4598ad5ec2accd9453 WatchSource:0}: Error finding container 219056faab4aab4db545e7d84b2455966abffa6a16af1e4598ad5ec2accd9453: Status 404 returned error can't find the container with id 219056faab4aab4db545e7d84b2455966abffa6a16af1e4598ad5ec2accd9453 Apr 21 17:36:29.905496 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:29.905461 2573 scope.go:117] "RemoveContainer" containerID="2bc31f9bef602c063d5d3239901f030520530e7499af3e1262c0b7d4a6cdbb7d" Apr 21 17:36:30.390427 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:30.390389 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4vw75" Apr 21 17:36:30.423992 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:30.423957 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" event={"ID":"8166e55e-1dae-40ee-ae0f-4833d4cff10c","Type":"ContainerStarted","Data":"219056faab4aab4db545e7d84b2455966abffa6a16af1e4598ad5ec2accd9453"} Apr 21 17:36:30.426459 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:30.426421 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mn2tn" event={"ID":"1cb4fd4e-5662-4538-8827-31633f56c7ed","Type":"ContainerStarted","Data":"39f3cf2236c027ed8fca651c415d85fb96e3f784ad19e8012580ad8a4206850f"} Apr 21 17:36:30.426459 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:30.426466 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mn2tn" event={"ID":"1cb4fd4e-5662-4538-8827-31633f56c7ed","Type":"ContainerStarted","Data":"cd5431816f60e18ff1a34505e01d05a8d55c0a73ee196da8fe1e5dfa347da19c"} Apr 21 17:36:30.428878 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:30.428857 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:36:30.429020 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:30.428958 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" event={"ID":"8a1fd903-a226-46d4-8e61-54eac7ea70b3","Type":"ContainerStarted","Data":"7f3506a5bde20d15e81b09153a7735a742b916b3d770f6eedeee6635e2ee83bd"} Apr 21 17:36:30.450741 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:30.450681 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mn2tn" podStartSLOduration=3.190133868 podStartE2EDuration="4.450661499s" podCreationTimestamp="2026-04-21 17:36:26 +0000 UTC" firstStartedPulling="2026-04-21 17:36:26.777394546 +0000 UTC m=+171.488088633" lastFinishedPulling="2026-04-21 17:36:28.037922162 +0000 UTC m=+172.748616264" observedRunningTime="2026-04-21 17:36:30.450414141 +0000 UTC m=+175.161108250" watchObservedRunningTime="2026-04-21 17:36:30.450661499 +0000 UTC m=+175.161355607" Apr 21 17:36:30.483754 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:30.483673 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" podStartSLOduration=43.365800695 podStartE2EDuration="45.483654314s" podCreationTimestamp="2026-04-21 17:35:45 +0000 UTC" firstStartedPulling="2026-04-21 17:35:45.474402757 +0000 UTC m=+130.185096843" lastFinishedPulling="2026-04-21 17:35:47.592256362 +0000 UTC 
m=+132.302950462" observedRunningTime="2026-04-21 17:36:30.482048936 +0000 UTC m=+175.192743074" watchObservedRunningTime="2026-04-21 17:36:30.483654314 +0000 UTC m=+175.194348421" Apr 21 17:36:31.210888 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:31.210849 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5"] Apr 21 17:36:31.214440 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:31.214410 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" Apr 21 17:36:31.218395 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:31.218368 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 17:36:31.218545 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:31.218415 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-s6bl9\"" Apr 21 17:36:31.226164 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:31.226104 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5"] Apr 21 17:36:31.313909 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:31.313871 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2bae9fa3-c4a6-4863-9da7-c77595420218-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vxpw5\" (UID: \"2bae9fa3-c4a6-4863-9da7-c77595420218\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" Apr 21 17:36:31.415305 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:31.415267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2bae9fa3-c4a6-4863-9da7-c77595420218-monitoring-plugin-cert\") pod 
\"monitoring-plugin-7dccd58f55-vxpw5\" (UID: \"2bae9fa3-c4a6-4863-9da7-c77595420218\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" Apr 21 17:36:31.415481 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:31.415451 2573 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 17:36:31.415565 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:31.415548 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bae9fa3-c4a6-4863-9da7-c77595420218-monitoring-plugin-cert podName:2bae9fa3-c4a6-4863-9da7-c77595420218 nodeName:}" failed. No retries permitted until 2026-04-21 17:36:31.91552396 +0000 UTC m=+176.626218049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/2bae9fa3-c4a6-4863-9da7-c77595420218-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-vxpw5" (UID: "2bae9fa3-c4a6-4863-9da7-c77595420218") : secret "monitoring-plugin-cert" not found Apr 21 17:36:31.919292 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:31.919261 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2bae9fa3-c4a6-4863-9da7-c77595420218-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vxpw5\" (UID: \"2bae9fa3-c4a6-4863-9da7-c77595420218\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" Apr 21 17:36:31.921969 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:31.921940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2bae9fa3-c4a6-4863-9da7-c77595420218-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vxpw5\" (UID: \"2bae9fa3-c4a6-4863-9da7-c77595420218\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" Apr 21 17:36:32.127101 ip-10-0-134-77 kubenswrapper[2573]: 
I0421 17:36:32.127063 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" Apr 21 17:36:32.249556 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:32.249525 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5"] Apr 21 17:36:32.252807 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:32.252774 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bae9fa3_c4a6_4863_9da7_c77595420218.slice/crio-b6d8af07a314d3960f7054b54b843e3cfa74a25d2b356db6364ee2a3b21d49ad WatchSource:0}: Error finding container b6d8af07a314d3960f7054b54b843e3cfa74a25d2b356db6364ee2a3b21d49ad: Status 404 returned error can't find the container with id b6d8af07a314d3960f7054b54b843e3cfa74a25d2b356db6364ee2a3b21d49ad Apr 21 17:36:32.439167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:32.439060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" event={"ID":"8166e55e-1dae-40ee-ae0f-4833d4cff10c","Type":"ContainerStarted","Data":"bc749dfe506a557d8aaa8518f400903e401ce2052442093af51e7f92874a114e"} Apr 21 17:36:32.439167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:32.439108 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" event={"ID":"8166e55e-1dae-40ee-ae0f-4833d4cff10c","Type":"ContainerStarted","Data":"25b43a32e3cbe262c6018370e98ac9137dfc026045e838cc882f2cc736f350a0"} Apr 21 17:36:32.439167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:32.439124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" event={"ID":"8166e55e-1dae-40ee-ae0f-4833d4cff10c","Type":"ContainerStarted","Data":"e632299add4f24258d5f821d813f24a0260f31900b5d25a8edf97c4117a46125"} Apr 21 17:36:32.441213 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:36:32.441124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" event={"ID":"2bae9fa3-c4a6-4863-9da7-c77595420218","Type":"ContainerStarted","Data":"b6d8af07a314d3960f7054b54b843e3cfa74a25d2b356db6364ee2a3b21d49ad"} Apr 21 17:36:33.447416 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:33.447368 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" event={"ID":"8166e55e-1dae-40ee-ae0f-4833d4cff10c","Type":"ContainerStarted","Data":"d056903b8305ba4826fbd4b17acadc7dc6ab25133805b581d7d391822bba7a1b"} Apr 21 17:36:33.447416 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:33.447416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" event={"ID":"8166e55e-1dae-40ee-ae0f-4833d4cff10c","Type":"ContainerStarted","Data":"3d330c68c4cac06a992ecca84069fecbadd4497dfc52ca463fa4cd5f8d87351d"} Apr 21 17:36:33.447939 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:33.447432 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" event={"ID":"8166e55e-1dae-40ee-ae0f-4833d4cff10c","Type":"ContainerStarted","Data":"18b5733128b32bb3f42aef4701bef18cec8d2da52df088bbbea99428b1c5e708"} Apr 21 17:36:33.447939 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:33.447700 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" Apr 21 17:36:33.474024 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:33.473970 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" podStartSLOduration=1.564055637 podStartE2EDuration="4.473956273s" podCreationTimestamp="2026-04-21 17:36:29 +0000 UTC" firstStartedPulling="2026-04-21 17:36:29.881880619 +0000 UTC m=+174.592574705" 
lastFinishedPulling="2026-04-21 17:36:32.791781253 +0000 UTC m=+177.502475341" observedRunningTime="2026-04-21 17:36:33.472440602 +0000 UTC m=+178.183134743" watchObservedRunningTime="2026-04-21 17:36:33.473956273 +0000 UTC m=+178.184650381" Apr 21 17:36:34.452246 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:34.452200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" event={"ID":"2bae9fa3-c4a6-4863-9da7-c77595420218","Type":"ContainerStarted","Data":"eac7ee9fec8cf27b65bfcb6a54006699a44bc9a635b54b50eeb2ae2161a868f4"} Apr 21 17:36:34.452677 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:34.452488 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" Apr 21 17:36:34.457700 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:34.457677 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" Apr 21 17:36:34.468987 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:34.468940 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vxpw5" podStartSLOduration=1.973273166 podStartE2EDuration="3.468925471s" podCreationTimestamp="2026-04-21 17:36:31 +0000 UTC" firstStartedPulling="2026-04-21 17:36:32.254683424 +0000 UTC m=+176.965377510" lastFinishedPulling="2026-04-21 17:36:33.750335721 +0000 UTC m=+178.461029815" observedRunningTime="2026-04-21 17:36:34.467927269 +0000 UTC m=+179.178621378" watchObservedRunningTime="2026-04-21 17:36:34.468925471 +0000 UTC m=+179.179619980" Apr 21 17:36:39.458581 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:39.458549 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-569f6b9d8b-r5tpr" Apr 21 17:36:40.388598 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:40.388565 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:36:40.430044 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:40.430009 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:36:40.435232 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:40.435193 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-pz6fc" Apr 21 17:36:41.214855 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.214818 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-657bc54896-lcvlq"] Apr 21 17:36:41.218201 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.218182 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.221703 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.221677 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 17:36:41.221848 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.221763 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 17:36:41.222879 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.222857 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 17:36:41.223097 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.223064 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 17:36:41.223200 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.223176 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 17:36:41.223200 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.223191 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 17:36:41.223315 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.223223 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-79d6t\"" Apr 21 17:36:41.223315 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.223242 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 17:36:41.227956 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.227932 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-657bc54896-lcvlq"] Apr 21 17:36:41.401674 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.401634 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzk5\" (UniqueName: \"kubernetes.io/projected/fa3d9541-9339-4abe-82d6-5378681f968e-kube-api-access-jwzk5\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.401674 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.401676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-serving-cert\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.401939 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.401764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-oauth-serving-cert\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.401939 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.401814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-service-ca\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.401939 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.401906 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-oauth-config\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.402046 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.401949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-console-config\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.502983 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.502945 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-oauth-config\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.503168 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.503001 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-console-config\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.503168 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.503020 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzk5\" (UniqueName: \"kubernetes.io/projected/fa3d9541-9339-4abe-82d6-5378681f968e-kube-api-access-jwzk5\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.503168 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.503040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-serving-cert\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.503168 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.503079 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-oauth-serving-cert\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.503168 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.503097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-service-ca\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.503820 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.503796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-console-config\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.503907 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.503857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-oauth-serving-cert\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.503907 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.503886 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-service-ca\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.505514 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.505488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-serving-cert\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.505583 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.505563 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-oauth-config\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 
17:36:41.511884 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.511858 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzk5\" (UniqueName: \"kubernetes.io/projected/fa3d9541-9339-4abe-82d6-5378681f968e-kube-api-access-jwzk5\") pod \"console-657bc54896-lcvlq\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.528478 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.527909 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:41.661117 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:41.661036 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-657bc54896-lcvlq"] Apr 21 17:36:41.663915 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:41.663883 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa3d9541_9339_4abe_82d6_5378681f968e.slice/crio-be197f52f0c172a6d368d32e06ae7b40a1fabd33282de16cd8c6320c8056bfd6 WatchSource:0}: Error finding container be197f52f0c172a6d368d32e06ae7b40a1fabd33282de16cd8c6320c8056bfd6: Status 404 returned error can't find the container with id be197f52f0c172a6d368d32e06ae7b40a1fabd33282de16cd8c6320c8056bfd6 Apr 21 17:36:42.476052 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:42.476013 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-657bc54896-lcvlq" event={"ID":"fa3d9541-9339-4abe-82d6-5378681f968e","Type":"ContainerStarted","Data":"be197f52f0c172a6d368d32e06ae7b40a1fabd33282de16cd8c6320c8056bfd6"} Apr 21 17:36:44.483036 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:44.482999 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-657bc54896-lcvlq" 
event={"ID":"fa3d9541-9339-4abe-82d6-5378681f968e","Type":"ContainerStarted","Data":"1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a"} Apr 21 17:36:44.502104 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:44.502049 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-657bc54896-lcvlq" podStartSLOduration=0.859228435 podStartE2EDuration="3.502032736s" podCreationTimestamp="2026-04-21 17:36:41 +0000 UTC" firstStartedPulling="2026-04-21 17:36:41.666400091 +0000 UTC m=+186.377094177" lastFinishedPulling="2026-04-21 17:36:44.30920439 +0000 UTC m=+189.019898478" observedRunningTime="2026-04-21 17:36:44.501289 +0000 UTC m=+189.211983104" watchObservedRunningTime="2026-04-21 17:36:44.502032736 +0000 UTC m=+189.212726843" Apr 21 17:36:45.402574 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.401824 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" podUID="ec9e9129-fbcf-473a-b882-5c7e25ee6b50" containerName="registry" containerID="cri-o://7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8" gracePeriod=30 Apr 21 17:36:45.663829 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.663750 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:36:45.838893 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.838854 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-image-registry-private-configuration\") pod \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " Apr 21 17:36:45.838893 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.838896 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-installation-pull-secrets\") pod \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " Apr 21 17:36:45.839187 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.838920 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-trusted-ca\") pod \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " Apr 21 17:36:45.839187 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.838961 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-ca-trust-extracted\") pod \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " Apr 21 17:36:45.839187 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.838985 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxclg\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-kube-api-access-vxclg\") pod \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\" (UID: 
\"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " Apr 21 17:36:45.839187 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.839006 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-bound-sa-token\") pod \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " Apr 21 17:36:45.839187 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.839032 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") pod \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " Apr 21 17:36:45.839187 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.839070 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-certificates\") pod \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\" (UID: \"ec9e9129-fbcf-473a-b882-5c7e25ee6b50\") " Apr 21 17:36:45.839490 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.839402 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ec9e9129-fbcf-473a-b882-5c7e25ee6b50" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:36:45.839621 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.839578 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ec9e9129-fbcf-473a-b882-5c7e25ee6b50" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:36:45.841757 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.841727 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ec9e9129-fbcf-473a-b882-5c7e25ee6b50" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:36:45.842326 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.842291 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ec9e9129-fbcf-473a-b882-5c7e25ee6b50" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:36:45.842456 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.842391 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ec9e9129-fbcf-473a-b882-5c7e25ee6b50" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:36:45.842456 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.842414 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-kube-api-access-vxclg" (OuterVolumeSpecName: "kube-api-access-vxclg") pod "ec9e9129-fbcf-473a-b882-5c7e25ee6b50" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50"). InnerVolumeSpecName "kube-api-access-vxclg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:36:45.843805 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.843525 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ec9e9129-fbcf-473a-b882-5c7e25ee6b50" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:36:45.851254 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.851215 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ec9e9129-fbcf-473a-b882-5c7e25ee6b50" (UID: "ec9e9129-fbcf-473a-b882-5c7e25ee6b50"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:36:45.940201 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.940163 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-image-registry-private-configuration\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:36:45.940201 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.940194 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-installation-pull-secrets\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:36:45.940201 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.940206 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-trusted-ca\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" 
Apr 21 17:36:45.940503 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.940216 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-ca-trust-extracted\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:36:45.940503 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.940228 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vxclg\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-kube-api-access-vxclg\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:36:45.940503 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.940237 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-bound-sa-token\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:36:45.940503 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.940246 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-tls\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:36:45.940503 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:45.940254 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ec9e9129-fbcf-473a-b882-5c7e25ee6b50-registry-certificates\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:36:46.491949 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:46.491916 2573 generic.go:358] "Generic (PLEG): container finished" podID="ec9e9129-fbcf-473a-b882-5c7e25ee6b50" containerID="7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8" exitCode=0 Apr 21 17:36:46.492099 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:46.491987 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" Apr 21 17:36:46.492099 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:46.491998 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" event={"ID":"ec9e9129-fbcf-473a-b882-5c7e25ee6b50","Type":"ContainerDied","Data":"7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8"} Apr 21 17:36:46.492099 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:46.492039 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-cd4c6697c-b2gpw" event={"ID":"ec9e9129-fbcf-473a-b882-5c7e25ee6b50","Type":"ContainerDied","Data":"81b2cc0522d5a9141e072507ae07a0e10f36494f81e9b6a669f5c0951b6da8ec"} Apr 21 17:36:46.492099 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:46.492058 2573 scope.go:117] "RemoveContainer" containerID="7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8" Apr 21 17:36:46.500727 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:46.500711 2573 scope.go:117] "RemoveContainer" containerID="7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8" Apr 21 17:36:46.501101 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:36:46.501067 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8\": container with ID starting with 7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8 not found: ID does not exist" containerID="7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8" Apr 21 17:36:46.501198 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:46.501112 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8"} err="failed to get container status 
\"7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8\": rpc error: code = NotFound desc = could not find container \"7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8\": container with ID starting with 7896fd86425aa007094675b7205c5ae15719fd275b5066a9e9f42067dcec70e8 not found: ID does not exist" Apr 21 17:36:46.509540 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:46.509506 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-cd4c6697c-b2gpw"] Apr 21 17:36:46.517674 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:46.517641 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-cd4c6697c-b2gpw"] Apr 21 17:36:47.909684 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:47.909641 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec9e9129-fbcf-473a-b882-5c7e25ee6b50" path="/var/lib/kubelet/pods/ec9e9129-fbcf-473a-b882-5c7e25ee6b50/volumes" Apr 21 17:36:49.218853 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.218818 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-696d4bccbb-wqzq5"] Apr 21 17:36:49.219228 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.219168 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec9e9129-fbcf-473a-b882-5c7e25ee6b50" containerName="registry" Apr 21 17:36:49.219228 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.219181 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9e9129-fbcf-473a-b882-5c7e25ee6b50" containerName="registry" Apr 21 17:36:49.219306 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.219240 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec9e9129-fbcf-473a-b882-5c7e25ee6b50" containerName="registry" Apr 21 17:36:49.225506 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.225488 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.234567 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.234540 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-696d4bccbb-wqzq5"] Apr 21 17:36:49.239955 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.239924 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 17:36:49.273520 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.273476 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-oauth-config\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.273734 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.273529 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-console-config\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.273734 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.273571 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-trusted-ca-bundle\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.273734 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.273617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-oauth-serving-cert\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.273734 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.273643 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-serving-cert\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.273734 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.273663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-service-ca\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.273917 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.273749 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4qzp\" (UniqueName: \"kubernetes.io/projected/3d02387c-a065-4627-982a-1de15c72a7b9-kube-api-access-q4qzp\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.374859 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.374823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-oauth-config\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.374859 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:36:49.374868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-console-config\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.375110 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.374889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-trusted-ca-bundle\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.375110 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.374914 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-oauth-serving-cert\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.375110 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.374936 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-serving-cert\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.375110 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.374951 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-service-ca\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 
17:36:49.375110 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.374983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4qzp\" (UniqueName: \"kubernetes.io/projected/3d02387c-a065-4627-982a-1de15c72a7b9-kube-api-access-q4qzp\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.375774 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.375742 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-console-config\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.375774 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.375764 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-oauth-serving-cert\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.375941 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.375770 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-service-ca\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.376074 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.376051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-trusted-ca-bundle\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " 
pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.377946 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.377918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-oauth-config\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.378166 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.378127 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-serving-cert\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.387491 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.387466 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4qzp\" (UniqueName: \"kubernetes.io/projected/3d02387c-a065-4627-982a-1de15c72a7b9-kube-api-access-q4qzp\") pod \"console-696d4bccbb-wqzq5\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.536796 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.536755 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:49.671980 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:49.671944 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-696d4bccbb-wqzq5"] Apr 21 17:36:49.676104 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:36:49.676076 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d02387c_a065_4627_982a_1de15c72a7b9.slice/crio-6c8b28edc143cfb8dff409dbc7a988fd714a19375ea897a5c0cbaa64824b3388 WatchSource:0}: Error finding container 6c8b28edc143cfb8dff409dbc7a988fd714a19375ea897a5c0cbaa64824b3388: Status 404 returned error can't find the container with id 6c8b28edc143cfb8dff409dbc7a988fd714a19375ea897a5c0cbaa64824b3388 Apr 21 17:36:50.505861 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:50.505820 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-696d4bccbb-wqzq5" event={"ID":"3d02387c-a065-4627-982a-1de15c72a7b9","Type":"ContainerStarted","Data":"e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4"} Apr 21 17:36:50.505861 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:50.505863 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-696d4bccbb-wqzq5" event={"ID":"3d02387c-a065-4627-982a-1de15c72a7b9","Type":"ContainerStarted","Data":"6c8b28edc143cfb8dff409dbc7a988fd714a19375ea897a5c0cbaa64824b3388"} Apr 21 17:36:50.524054 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:50.523996 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-696d4bccbb-wqzq5" podStartSLOduration=1.523976912 podStartE2EDuration="1.523976912s" podCreationTimestamp="2026-04-21 17:36:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:36:50.523667434 +0000 UTC m=+195.234361567" 
watchObservedRunningTime="2026-04-21 17:36:50.523976912 +0000 UTC m=+195.234671021" Apr 21 17:36:51.528938 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:51.528905 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:51.528938 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:51.528942 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:51.533977 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:51.533951 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:52.517928 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:52.517898 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:36:59.537779 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:59.537746 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:59.538225 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:59.537823 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:36:59.542654 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:36:59.542628 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:37:00.540796 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:00.540765 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:37:00.589837 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:00.589798 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-657bc54896-lcvlq"] Apr 21 17:37:18.587329 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:18.587290 2573 
generic.go:358] "Generic (PLEG): container finished" podID="a784f755-6ef2-4edf-993b-25f3e45d1082" containerID="b5909a37956215864c18516a35c491ae40e30b6aa64dc0331851216cd91b62b9" exitCode=0 Apr 21 17:37:18.587811 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:18.587362 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jb47l" event={"ID":"a784f755-6ef2-4edf-993b-25f3e45d1082","Type":"ContainerDied","Data":"b5909a37956215864c18516a35c491ae40e30b6aa64dc0331851216cd91b62b9"} Apr 21 17:37:18.587811 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:18.587723 2573 scope.go:117] "RemoveContainer" containerID="b5909a37956215864c18516a35c491ae40e30b6aa64dc0331851216cd91b62b9" Apr 21 17:37:19.591778 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:19.591744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-jb47l" event={"ID":"a784f755-6ef2-4edf-993b-25f3e45d1082","Type":"ContainerStarted","Data":"4c9358887acc9333b3d37881755684d9b7181853d4c1cc3d7ba29aceea9a632f"} Apr 21 17:37:25.610579 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.610518 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-657bc54896-lcvlq" podUID="fa3d9541-9339-4abe-82d6-5378681f968e" containerName="console" containerID="cri-o://1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a" gracePeriod=15 Apr 21 17:37:25.848228 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.848205 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-657bc54896-lcvlq_fa3d9541-9339-4abe-82d6-5378681f968e/console/0.log" Apr 21 17:37:25.848377 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.848264 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:37:25.980040 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.979937 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-oauth-serving-cert\") pod \"fa3d9541-9339-4abe-82d6-5378681f968e\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " Apr 21 17:37:25.980040 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.979983 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-serving-cert\") pod \"fa3d9541-9339-4abe-82d6-5378681f968e\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " Apr 21 17:37:25.980040 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.980004 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-service-ca\") pod \"fa3d9541-9339-4abe-82d6-5378681f968e\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " Apr 21 17:37:25.980040 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.980033 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-oauth-config\") pod \"fa3d9541-9339-4abe-82d6-5378681f968e\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " Apr 21 17:37:25.980405 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.980058 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwzk5\" (UniqueName: \"kubernetes.io/projected/fa3d9541-9339-4abe-82d6-5378681f968e-kube-api-access-jwzk5\") pod \"fa3d9541-9339-4abe-82d6-5378681f968e\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " Apr 21 17:37:25.980405 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.980103 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-console-config\") pod \"fa3d9541-9339-4abe-82d6-5378681f968e\" (UID: \"fa3d9541-9339-4abe-82d6-5378681f968e\") " Apr 21 17:37:25.980515 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.980483 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fa3d9541-9339-4abe-82d6-5378681f968e" (UID: "fa3d9541-9339-4abe-82d6-5378681f968e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:25.980567 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.980509 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-service-ca" (OuterVolumeSpecName: "service-ca") pod "fa3d9541-9339-4abe-82d6-5378681f968e" (UID: "fa3d9541-9339-4abe-82d6-5378681f968e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:25.980567 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.980517 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-console-config" (OuterVolumeSpecName: "console-config") pod "fa3d9541-9339-4abe-82d6-5378681f968e" (UID: "fa3d9541-9339-4abe-82d6-5378681f968e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:37:25.982512 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.982484 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fa3d9541-9339-4abe-82d6-5378681f968e" (UID: "fa3d9541-9339-4abe-82d6-5378681f968e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:25.982512 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.982475 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fa3d9541-9339-4abe-82d6-5378681f968e" (UID: "fa3d9541-9339-4abe-82d6-5378681f968e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:37:25.982659 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:25.982523 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa3d9541-9339-4abe-82d6-5378681f968e-kube-api-access-jwzk5" (OuterVolumeSpecName: "kube-api-access-jwzk5") pod "fa3d9541-9339-4abe-82d6-5378681f968e" (UID: "fa3d9541-9339-4abe-82d6-5378681f968e"). InnerVolumeSpecName "kube-api-access-jwzk5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:37:26.080756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.080715 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-oauth-serving-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:37:26.080756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.080744 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-serving-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:37:26.080756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.080757 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-service-ca\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:37:26.080756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.080767 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa3d9541-9339-4abe-82d6-5378681f968e-console-oauth-config\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:37:26.081040 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.080775 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jwzk5\" (UniqueName: \"kubernetes.io/projected/fa3d9541-9339-4abe-82d6-5378681f968e-kube-api-access-jwzk5\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:37:26.081040 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.080785 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa3d9541-9339-4abe-82d6-5378681f968e-console-config\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:37:26.613907 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:37:26.613872 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-657bc54896-lcvlq_fa3d9541-9339-4abe-82d6-5378681f968e/console/0.log" Apr 21 17:37:26.614394 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.613920 2573 generic.go:358] "Generic (PLEG): container finished" podID="fa3d9541-9339-4abe-82d6-5378681f968e" containerID="1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a" exitCode=2 Apr 21 17:37:26.614394 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.613989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-657bc54896-lcvlq" event={"ID":"fa3d9541-9339-4abe-82d6-5378681f968e","Type":"ContainerDied","Data":"1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a"} Apr 21 17:37:26.614394 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.614009 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-657bc54896-lcvlq" Apr 21 17:37:26.614394 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.614026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-657bc54896-lcvlq" event={"ID":"fa3d9541-9339-4abe-82d6-5378681f968e","Type":"ContainerDied","Data":"be197f52f0c172a6d368d32e06ae7b40a1fabd33282de16cd8c6320c8056bfd6"} Apr 21 17:37:26.614394 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.614052 2573 scope.go:117] "RemoveContainer" containerID="1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a" Apr 21 17:37:26.622069 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.622053 2573 scope.go:117] "RemoveContainer" containerID="1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a" Apr 21 17:37:26.622373 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:37:26.622348 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a\": container with ID starting with 1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a not found: ID does not exist" containerID="1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a" Apr 21 17:37:26.622444 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.622382 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a"} err="failed to get container status \"1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a\": rpc error: code = NotFound desc = could not find container \"1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a\": container with ID starting with 1ed3354ff6b94af7560cde235299eaa72ea8fc4a1afa7cff67f827045feaab7a not found: ID does not exist" Apr 21 17:37:26.635149 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.635091 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-657bc54896-lcvlq"] Apr 21 17:37:26.640520 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:26.640491 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-657bc54896-lcvlq"] Apr 21 17:37:27.909561 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:27.909529 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa3d9541-9339-4abe-82d6-5378681f968e" path="/var/lib/kubelet/pods/fa3d9541-9339-4abe-82d6-5378681f968e/volumes" Apr 21 17:37:47.759633 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:47.759589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:37:47.761941 ip-10-0-134-77 kubenswrapper[2573]: 
I0421 17:37:47.761921 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dcd8a3eb-e25c-4dcb-9468-d578a60a826c-metrics-certs\") pod \"network-metrics-daemon-wtk7c\" (UID: \"dcd8a3eb-e25c-4dcb-9468-d578a60a826c\") " pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:37:48.009201 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:48.009164 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-pqc26\"" Apr 21 17:37:48.016306 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:48.016210 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wtk7c" Apr 21 17:37:48.154990 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:48.154955 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wtk7c"] Apr 21 17:37:48.158422 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:37:48.158391 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcd8a3eb_e25c_4dcb_9468_d578a60a826c.slice/crio-f4b0e1469437746ca843aa3a4b66ef22faa3cf8a20b3c490056c9cc59649d70b WatchSource:0}: Error finding container f4b0e1469437746ca843aa3a4b66ef22faa3cf8a20b3c490056c9cc59649d70b: Status 404 returned error can't find the container with id f4b0e1469437746ca843aa3a4b66ef22faa3cf8a20b3c490056c9cc59649d70b Apr 21 17:37:48.685627 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:48.685582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wtk7c" event={"ID":"dcd8a3eb-e25c-4dcb-9468-d578a60a826c","Type":"ContainerStarted","Data":"f4b0e1469437746ca843aa3a4b66ef22faa3cf8a20b3c490056c9cc59649d70b"} Apr 21 17:37:49.690361 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:49.690322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-wtk7c" event={"ID":"dcd8a3eb-e25c-4dcb-9468-d578a60a826c","Type":"ContainerStarted","Data":"771208b98e952f7ba8273e996c793b421bd115f258bf107ef5223a126f71b3ea"} Apr 21 17:37:49.690361 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:49.690366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wtk7c" event={"ID":"dcd8a3eb-e25c-4dcb-9468-d578a60a826c","Type":"ContainerStarted","Data":"d5ab596f32a4cc4599a77689e6980b286ff894ab7171d0f1543efe5c157b2373"} Apr 21 17:37:49.708112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:49.708055 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wtk7c" podStartSLOduration=252.768210837 podStartE2EDuration="4m13.708036333s" podCreationTimestamp="2026-04-21 17:33:36 +0000 UTC" firstStartedPulling="2026-04-21 17:37:48.160242102 +0000 UTC m=+252.870936188" lastFinishedPulling="2026-04-21 17:37:49.100067592 +0000 UTC m=+253.810761684" observedRunningTime="2026-04-21 17:37:49.707160973 +0000 UTC m=+254.417855078" watchObservedRunningTime="2026-04-21 17:37:49.708036333 +0000 UTC m=+254.418730442" Apr 21 17:37:50.819622 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.819522 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-b98c9d888-h4g5h"] Apr 21 17:37:50.820082 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.819956 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa3d9541-9339-4abe-82d6-5378681f968e" containerName="console" Apr 21 17:37:50.820082 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.819971 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa3d9541-9339-4abe-82d6-5378681f968e" containerName="console" Apr 21 17:37:50.820082 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.820039 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="fa3d9541-9339-4abe-82d6-5378681f968e" containerName="console" Apr 21 17:37:50.823593 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.823558 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.826373 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.826347 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 17:37:50.826534 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.826347 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-7mdkd\"" Apr 21 17:37:50.826804 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.826785 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 17:37:50.826877 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.826804 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 17:37:50.826877 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.826808 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 17:37:50.827384 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.827367 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 17:37:50.832528 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.832499 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 17:37:50.836030 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.836004 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/telemeter-client-b98c9d888-h4g5h"] Apr 21 17:37:50.883514 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.883477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-telemeter-client-tls\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.883514 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.883518 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-secret-telemeter-client\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.883728 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.883547 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckpzl\" (UniqueName: \"kubernetes.io/projected/fa555626-6c87-484e-9f9b-4f1ff5732351-kube-api-access-ckpzl\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.883728 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.883630 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-federate-client-tls\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.883728 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.883683 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa555626-6c87-484e-9f9b-4f1ff5732351-serving-certs-ca-bundle\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.883728 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.883703 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.883854 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.883732 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa555626-6c87-484e-9f9b-4f1ff5732351-metrics-client-ca\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.883854 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.883822 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa555626-6c87-484e-9f9b-4f1ff5732351-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.986106 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.984562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-telemeter-client-tls\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.986106 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.984616 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-secret-telemeter-client\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.986106 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.984659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckpzl\" (UniqueName: \"kubernetes.io/projected/fa555626-6c87-484e-9f9b-4f1ff5732351-kube-api-access-ckpzl\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.986106 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.984694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-federate-client-tls\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.986106 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.984724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa555626-6c87-484e-9f9b-4f1ff5732351-serving-certs-ca-bundle\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 
17:37:50.986106 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.984752 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.986106 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.984796 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fa555626-6c87-484e-9f9b-4f1ff5732351-metrics-client-ca\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.986106 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.984847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa555626-6c87-484e-9f9b-4f1ff5732351-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.986106 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.985828 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa555626-6c87-484e-9f9b-4f1ff5732351-telemeter-trusted-ca-bundle\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.986772 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.986654 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/fa555626-6c87-484e-9f9b-4f1ff5732351-metrics-client-ca\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.987430 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.987398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa555626-6c87-484e-9f9b-4f1ff5732351-serving-certs-ca-bundle\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.988249 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.988224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-federate-client-tls\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.988497 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.988477 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-telemeter-client-tls\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.988564 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.988500 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " 
pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.988658 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.988643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/fa555626-6c87-484e-9f9b-4f1ff5732351-secret-telemeter-client\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:50.996249 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:50.996224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckpzl\" (UniqueName: \"kubernetes.io/projected/fa555626-6c87-484e-9f9b-4f1ff5732351-kube-api-access-ckpzl\") pod \"telemeter-client-b98c9d888-h4g5h\" (UID: \"fa555626-6c87-484e-9f9b-4f1ff5732351\") " pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:51.135435 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:51.135340 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" Apr 21 17:37:51.295379 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:51.295338 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-b98c9d888-h4g5h"] Apr 21 17:37:51.299959 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:37:51.299922 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa555626_6c87_484e_9f9b_4f1ff5732351.slice/crio-6258e3d33210286e019a910719a4ed4abff687741cff87b819a516df8151a655 WatchSource:0}: Error finding container 6258e3d33210286e019a910719a4ed4abff687741cff87b819a516df8151a655: Status 404 returned error can't find the container with id 6258e3d33210286e019a910719a4ed4abff687741cff87b819a516df8151a655 Apr 21 17:37:51.697596 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:51.697559 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" event={"ID":"fa555626-6c87-484e-9f9b-4f1ff5732351","Type":"ContainerStarted","Data":"6258e3d33210286e019a910719a4ed4abff687741cff87b819a516df8151a655"} Apr 21 17:37:53.704377 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:53.704343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" event={"ID":"fa555626-6c87-484e-9f9b-4f1ff5732351","Type":"ContainerStarted","Data":"6137e12a997107221fead3830c908577f12864d17b5067bd9cd92d8f6ed2e835"} Apr 21 17:37:54.708662 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:54.708630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" event={"ID":"fa555626-6c87-484e-9f9b-4f1ff5732351","Type":"ContainerStarted","Data":"9152adec972feb65a79c9b65232c3484d36a0e693b2d19c543922e59686799b3"} Apr 21 17:37:55.713586 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:55.713552 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" event={"ID":"fa555626-6c87-484e-9f9b-4f1ff5732351","Type":"ContainerStarted","Data":"194ea647a092dddbc744562c7d9492022cf5abd54b8d4b91ec7b6390f8bb80b8"} Apr 21 17:37:55.744635 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:55.744584 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-b98c9d888-h4g5h" podStartSLOduration=2.428477274 podStartE2EDuration="5.744566425s" podCreationTimestamp="2026-04-21 17:37:50 +0000 UTC" firstStartedPulling="2026-04-21 17:37:51.30201159 +0000 UTC m=+256.012705678" lastFinishedPulling="2026-04-21 17:37:54.618100739 +0000 UTC m=+259.328794829" observedRunningTime="2026-04-21 17:37:55.743505523 +0000 UTC m=+260.454199656" watchObservedRunningTime="2026-04-21 17:37:55.744566425 +0000 UTC m=+260.455260512" Apr 21 17:37:56.838770 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.838736 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84b9db998d-7c2lz"] Apr 21 17:37:56.843304 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.843271 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:56.850095 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.850064 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84b9db998d-7c2lz"] Apr 21 17:37:56.935442 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.935409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-serving-cert\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:56.935442 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.935449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-console-config\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:56.935675 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.935474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-service-ca\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:56.935675 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.935558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-trusted-ca-bundle\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:56.935675 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.935597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-oauth-config\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:56.935675 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.935615 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnjvf\" (UniqueName: \"kubernetes.io/projected/b7375079-7914-401b-acf2-42c7b45ca910-kube-api-access-mnjvf\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:56.935675 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:56.935638 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-oauth-serving-cert\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.036869 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.036827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-serving-cert\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.036869 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.036872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-console-config\") pod 
\"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.037164 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.036896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-service-ca\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.037164 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.036920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-trusted-ca-bundle\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.037164 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.036940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-oauth-config\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.037164 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.036959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnjvf\" (UniqueName: \"kubernetes.io/projected/b7375079-7914-401b-acf2-42c7b45ca910-kube-api-access-mnjvf\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.037164 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.036977 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-oauth-serving-cert\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.037723 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.037695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-console-config\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.037850 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.037774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-service-ca\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.037850 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.037828 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-oauth-serving-cert\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.038197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.038176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-trusted-ca-bundle\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.039531 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.039507 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-oauth-config\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.039633 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.039596 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-serving-cert\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.045834 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.045811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnjvf\" (UniqueName: \"kubernetes.io/projected/b7375079-7914-401b-acf2-42c7b45ca910-kube-api-access-mnjvf\") pod \"console-84b9db998d-7c2lz\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.154186 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.154071 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:37:57.299660 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.299628 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84b9db998d-7c2lz"] Apr 21 17:37:57.302261 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:37:57.302217 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7375079_7914_401b_acf2_42c7b45ca910.slice/crio-ce5256033e5d963b4c660a26a1b31c8705823c00c826d936bef2cce71ed74b73 WatchSource:0}: Error finding container ce5256033e5d963b4c660a26a1b31c8705823c00c826d936bef2cce71ed74b73: Status 404 returned error can't find the container with id ce5256033e5d963b4c660a26a1b31c8705823c00c826d936bef2cce71ed74b73 Apr 21 17:37:57.720731 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.720695 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84b9db998d-7c2lz" event={"ID":"b7375079-7914-401b-acf2-42c7b45ca910","Type":"ContainerStarted","Data":"c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc"} Apr 21 17:37:57.720731 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.720731 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84b9db998d-7c2lz" event={"ID":"b7375079-7914-401b-acf2-42c7b45ca910","Type":"ContainerStarted","Data":"ce5256033e5d963b4c660a26a1b31c8705823c00c826d936bef2cce71ed74b73"} Apr 21 17:37:57.744145 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:37:57.744074 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84b9db998d-7c2lz" podStartSLOduration=1.744059253 podStartE2EDuration="1.744059253s" podCreationTimestamp="2026-04-21 17:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:37:57.742460156 +0000 UTC m=+262.453154266" 
watchObservedRunningTime="2026-04-21 17:37:57.744059253 +0000 UTC m=+262.454753362" Apr 21 17:38:07.154245 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:07.154193 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:38:07.154245 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:07.154251 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:38:07.159289 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:07.159263 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:38:07.755503 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:07.755471 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:38:07.806761 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:07.806721 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-696d4bccbb-wqzq5"] Apr 21 17:38:32.826853 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:32.826779 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-696d4bccbb-wqzq5" podUID="3d02387c-a065-4627-982a-1de15c72a7b9" containerName="console" containerID="cri-o://e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4" gracePeriod=15 Apr 21 17:38:33.072780 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.072754 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-696d4bccbb-wqzq5_3d02387c-a065-4627-982a-1de15c72a7b9/console/0.log" Apr 21 17:38:33.072926 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.072816 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:38:33.245777 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.245742 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-trusted-ca-bundle\") pod \"3d02387c-a065-4627-982a-1de15c72a7b9\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " Apr 21 17:38:33.245941 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.245804 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-oauth-serving-cert\") pod \"3d02387c-a065-4627-982a-1de15c72a7b9\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " Apr 21 17:38:33.245941 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.245830 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-service-ca\") pod \"3d02387c-a065-4627-982a-1de15c72a7b9\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " Apr 21 17:38:33.245941 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.245861 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-serving-cert\") pod \"3d02387c-a065-4627-982a-1de15c72a7b9\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " Apr 21 17:38:33.245941 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.245909 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-console-config\") pod \"3d02387c-a065-4627-982a-1de15c72a7b9\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " Apr 21 17:38:33.246165 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:38:33.245961 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4qzp\" (UniqueName: \"kubernetes.io/projected/3d02387c-a065-4627-982a-1de15c72a7b9-kube-api-access-q4qzp\") pod \"3d02387c-a065-4627-982a-1de15c72a7b9\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " Apr 21 17:38:33.246165 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.246003 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-oauth-config\") pod \"3d02387c-a065-4627-982a-1de15c72a7b9\" (UID: \"3d02387c-a065-4627-982a-1de15c72a7b9\") " Apr 21 17:38:33.246360 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.246249 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3d02387c-a065-4627-982a-1de15c72a7b9" (UID: "3d02387c-a065-4627-982a-1de15c72a7b9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:38:33.246360 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.246330 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3d02387c-a065-4627-982a-1de15c72a7b9" (UID: "3d02387c-a065-4627-982a-1de15c72a7b9"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:38:33.246479 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.246369 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-service-ca" (OuterVolumeSpecName: "service-ca") pod "3d02387c-a065-4627-982a-1de15c72a7b9" (UID: "3d02387c-a065-4627-982a-1de15c72a7b9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:38:33.246479 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.246445 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-console-config" (OuterVolumeSpecName: "console-config") pod "3d02387c-a065-4627-982a-1de15c72a7b9" (UID: "3d02387c-a065-4627-982a-1de15c72a7b9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:38:33.248176 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.248153 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d02387c-a065-4627-982a-1de15c72a7b9-kube-api-access-q4qzp" (OuterVolumeSpecName: "kube-api-access-q4qzp") pod "3d02387c-a065-4627-982a-1de15c72a7b9" (UID: "3d02387c-a065-4627-982a-1de15c72a7b9"). InnerVolumeSpecName "kube-api-access-q4qzp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:38:33.248281 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.248262 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3d02387c-a065-4627-982a-1de15c72a7b9" (UID: "3d02387c-a065-4627-982a-1de15c72a7b9"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:38:33.248320 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.248277 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3d02387c-a065-4627-982a-1de15c72a7b9" (UID: "3d02387c-a065-4627-982a-1de15c72a7b9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:38:33.347080 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.347041 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-serving-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:38:33.347080 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.347074 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-console-config\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:38:33.347080 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.347085 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4qzp\" (UniqueName: \"kubernetes.io/projected/3d02387c-a065-4627-982a-1de15c72a7b9-kube-api-access-q4qzp\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:38:33.347346 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.347094 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3d02387c-a065-4627-982a-1de15c72a7b9-console-oauth-config\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:38:33.347346 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.347104 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-trusted-ca-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:38:33.347346 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.347112 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-oauth-serving-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:38:33.347346 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.347120 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3d02387c-a065-4627-982a-1de15c72a7b9-service-ca\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:38:33.831492 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.831465 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-696d4bccbb-wqzq5_3d02387c-a065-4627-982a-1de15c72a7b9/console/0.log" Apr 21 17:38:33.831891 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.831505 2573 generic.go:358] "Generic (PLEG): container finished" podID="3d02387c-a065-4627-982a-1de15c72a7b9" containerID="e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4" exitCode=2 Apr 21 17:38:33.831891 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.831540 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-696d4bccbb-wqzq5" event={"ID":"3d02387c-a065-4627-982a-1de15c72a7b9","Type":"ContainerDied","Data":"e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4"} Apr 21 17:38:33.831891 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.831571 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-696d4bccbb-wqzq5" Apr 21 17:38:33.831891 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.831581 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-696d4bccbb-wqzq5" event={"ID":"3d02387c-a065-4627-982a-1de15c72a7b9","Type":"ContainerDied","Data":"6c8b28edc143cfb8dff409dbc7a988fd714a19375ea897a5c0cbaa64824b3388"} Apr 21 17:38:33.831891 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.831598 2573 scope.go:117] "RemoveContainer" containerID="e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4" Apr 21 17:38:33.839977 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.839954 2573 scope.go:117] "RemoveContainer" containerID="e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4" Apr 21 17:38:33.840278 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:38:33.840258 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4\": container with ID starting with e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4 not found: ID does not exist" containerID="e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4" Apr 21 17:38:33.840359 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.840284 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4"} err="failed to get container status \"e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4\": rpc error: code = NotFound desc = could not find container \"e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4\": container with ID starting with e99fd94d1b5c3804f8206ed47d93d4bf70410cae2980ca7ae4da0aef0a8ccfd4 not found: ID does not exist" Apr 21 17:38:33.852625 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.852597 2573 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-696d4bccbb-wqzq5"] Apr 21 17:38:33.856929 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.856899 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-696d4bccbb-wqzq5"] Apr 21 17:38:33.909632 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:33.909594 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d02387c-a065-4627-982a-1de15c72a7b9" path="/var/lib/kubelet/pods/3d02387c-a065-4627-982a-1de15c72a7b9/volumes" Apr 21 17:38:35.777149 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:35.777097 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:38:35.777583 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:35.777354 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:38:35.782094 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:35.782061 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:38:35.782698 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:35.782677 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:38:35.788474 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:38:35.788449 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 17:39:02.743083 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.742377 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-696f884676-nlxbt"] Apr 21 17:39:02.745802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.743837 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="3d02387c-a065-4627-982a-1de15c72a7b9" containerName="console" Apr 21 17:39:02.745802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.743863 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d02387c-a065-4627-982a-1de15c72a7b9" containerName="console" Apr 21 17:39:02.745802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.743957 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d02387c-a065-4627-982a-1de15c72a7b9" containerName="console" Apr 21 17:39:02.746777 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.746756 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.758346 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.758311 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-696f884676-nlxbt"] Apr 21 17:39:02.781893 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.781853 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcg8\" (UniqueName: \"kubernetes.io/projected/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-kube-api-access-qmcg8\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.782070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.781898 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-oauth-serving-cert\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.782070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.781929 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-oauth-config\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.782070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.781976 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-service-ca\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.782070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.781994 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-trusted-ca-bundle\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.782070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.782065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-serving-cert\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.782283 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.782085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-config\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.883478 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.883432 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-serving-cert\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.883478 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.883478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-config\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.883763 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.883514 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcg8\" (UniqueName: \"kubernetes.io/projected/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-kube-api-access-qmcg8\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.883763 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.883537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-oauth-serving-cert\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.883763 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.883558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-oauth-config\") pod \"console-696f884676-nlxbt\" (UID: 
\"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.883763 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.883605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-service-ca\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.883763 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.883628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-trusted-ca-bundle\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.884489 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.884459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-oauth-serving-cert\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.884489 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.884476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-service-ca\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.884640 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.884459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-config\") pod 
\"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.884640 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.884510 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-trusted-ca-bundle\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.886078 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.886048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-oauth-config\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.886078 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.886066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-serving-cert\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:02.892922 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:02.892895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcg8\" (UniqueName: \"kubernetes.io/projected/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-kube-api-access-qmcg8\") pod \"console-696f884676-nlxbt\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:03.059263 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:03.059168 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:03.189616 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:03.189591 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-696f884676-nlxbt"] Apr 21 17:39:03.192396 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:39:03.192366 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8df74c_1bed_46ce_9580_75cfa8ffc3d6.slice/crio-2cf6e1a6e4720944e9b8c68865c62b245fbadbb922d76d4d27d0a5db8880c5db WatchSource:0}: Error finding container 2cf6e1a6e4720944e9b8c68865c62b245fbadbb922d76d4d27d0a5db8880c5db: Status 404 returned error can't find the container with id 2cf6e1a6e4720944e9b8c68865c62b245fbadbb922d76d4d27d0a5db8880c5db Apr 21 17:39:03.194166 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:03.194128 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 17:39:03.918202 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:03.918159 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-696f884676-nlxbt" event={"ID":"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6","Type":"ContainerStarted","Data":"34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254"} Apr 21 17:39:03.918202 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:03.918197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-696f884676-nlxbt" event={"ID":"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6","Type":"ContainerStarted","Data":"2cf6e1a6e4720944e9b8c68865c62b245fbadbb922d76d4d27d0a5db8880c5db"} Apr 21 17:39:03.937785 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:03.937602 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-696f884676-nlxbt" podStartSLOduration=1.9375865939999999 podStartE2EDuration="1.937586594s" podCreationTimestamp="2026-04-21 17:39:02 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:39:03.936927689 +0000 UTC m=+328.647621838" watchObservedRunningTime="2026-04-21 17:39:03.937586594 +0000 UTC m=+328.648280703" Apr 21 17:39:13.060150 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:13.060066 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:13.060150 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:13.060156 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:13.065001 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:13.064967 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:13.949411 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:13.949377 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:39:13.999403 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:13.999369 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84b9db998d-7c2lz"] Apr 21 17:39:39.021453 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.021350 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84b9db998d-7c2lz" podUID="b7375079-7914-401b-acf2-42c7b45ca910" containerName="console" containerID="cri-o://c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc" gracePeriod=15 Apr 21 17:39:39.261950 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.261924 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84b9db998d-7c2lz_b7375079-7914-401b-acf2-42c7b45ca910/console/0.log" Apr 21 17:39:39.262102 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.261993 2573 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:39:39.300792 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.300695 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-console-config\") pod \"b7375079-7914-401b-acf2-42c7b45ca910\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " Apr 21 17:39:39.300792 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.300742 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-service-ca\") pod \"b7375079-7914-401b-acf2-42c7b45ca910\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " Apr 21 17:39:39.300792 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.300772 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-trusted-ca-bundle\") pod \"b7375079-7914-401b-acf2-42c7b45ca910\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " Apr 21 17:39:39.301081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.300798 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-oauth-serving-cert\") pod \"b7375079-7914-401b-acf2-42c7b45ca910\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " Apr 21 17:39:39.301081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.300916 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-serving-cert\") pod \"b7375079-7914-401b-acf2-42c7b45ca910\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " Apr 21 
17:39:39.301081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.300954 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnjvf\" (UniqueName: \"kubernetes.io/projected/b7375079-7914-401b-acf2-42c7b45ca910-kube-api-access-mnjvf\") pod \"b7375079-7914-401b-acf2-42c7b45ca910\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " Apr 21 17:39:39.301081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.301002 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-oauth-config\") pod \"b7375079-7914-401b-acf2-42c7b45ca910\" (UID: \"b7375079-7914-401b-acf2-42c7b45ca910\") " Apr 21 17:39:39.301300 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.301217 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-console-config" (OuterVolumeSpecName: "console-config") pod "b7375079-7914-401b-acf2-42c7b45ca910" (UID: "b7375079-7914-401b-acf2-42c7b45ca910"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:39:39.301300 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.301262 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b7375079-7914-401b-acf2-42c7b45ca910" (UID: "b7375079-7914-401b-acf2-42c7b45ca910"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:39:39.301440 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.301302 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b7375079-7914-401b-acf2-42c7b45ca910" (UID: "b7375079-7914-401b-acf2-42c7b45ca910"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:39:39.301545 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.301524 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-service-ca" (OuterVolumeSpecName: "service-ca") pod "b7375079-7914-401b-acf2-42c7b45ca910" (UID: "b7375079-7914-401b-acf2-42c7b45ca910"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:39:39.303280 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.303247 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7375079-7914-401b-acf2-42c7b45ca910-kube-api-access-mnjvf" (OuterVolumeSpecName: "kube-api-access-mnjvf") pod "b7375079-7914-401b-acf2-42c7b45ca910" (UID: "b7375079-7914-401b-acf2-42c7b45ca910"). InnerVolumeSpecName "kube-api-access-mnjvf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:39:39.303387 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.303310 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b7375079-7914-401b-acf2-42c7b45ca910" (UID: "b7375079-7914-401b-acf2-42c7b45ca910"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:39:39.303387 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.303326 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b7375079-7914-401b-acf2-42c7b45ca910" (UID: "b7375079-7914-401b-acf2-42c7b45ca910"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:39:39.402155 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.402093 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-service-ca\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:39:39.402155 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.402126 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-trusted-ca-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:39:39.402155 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.402165 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-oauth-serving-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:39:39.402391 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.402179 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-serving-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:39:39.402391 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.402195 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mnjvf\" (UniqueName: 
\"kubernetes.io/projected/b7375079-7914-401b-acf2-42c7b45ca910-kube-api-access-mnjvf\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:39:39.402391 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.402208 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7375079-7914-401b-acf2-42c7b45ca910-console-oauth-config\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:39:39.402391 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:39.402217 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7375079-7914-401b-acf2-42c7b45ca910-console-config\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:39:40.021647 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:40.021608 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84b9db998d-7c2lz_b7375079-7914-401b-acf2-42c7b45ca910/console/0.log" Apr 21 17:39:40.022119 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:40.021653 2573 generic.go:358] "Generic (PLEG): container finished" podID="b7375079-7914-401b-acf2-42c7b45ca910" containerID="c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc" exitCode=2 Apr 21 17:39:40.022119 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:40.021690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84b9db998d-7c2lz" event={"ID":"b7375079-7914-401b-acf2-42c7b45ca910","Type":"ContainerDied","Data":"c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc"} Apr 21 17:39:40.022119 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:40.021719 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84b9db998d-7c2lz" event={"ID":"b7375079-7914-401b-acf2-42c7b45ca910","Type":"ContainerDied","Data":"ce5256033e5d963b4c660a26a1b31c8705823c00c826d936bef2cce71ed74b73"} Apr 21 17:39:40.022119 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:39:40.021718 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84b9db998d-7c2lz" Apr 21 17:39:40.022119 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:40.021732 2573 scope.go:117] "RemoveContainer" containerID="c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc" Apr 21 17:39:40.029688 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:40.029670 2573 scope.go:117] "RemoveContainer" containerID="c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc" Apr 21 17:39:40.029947 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:39:40.029930 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc\": container with ID starting with c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc not found: ID does not exist" containerID="c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc" Apr 21 17:39:40.030001 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:40.029956 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc"} err="failed to get container status \"c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc\": rpc error: code = NotFound desc = could not find container \"c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc\": container with ID starting with c8eba5626aa44f561cb4bc422a7b1d9087b19e89e11bb180c23a82e0a23960fc not found: ID does not exist" Apr 21 17:39:40.040127 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:40.040090 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84b9db998d-7c2lz"] Apr 21 17:39:40.044281 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:40.044255 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-84b9db998d-7c2lz"] Apr 21 17:39:41.664909 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.664875 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fxbtq"] Apr 21 17:39:41.665371 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.665202 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7375079-7914-401b-acf2-42c7b45ca910" containerName="console" Apr 21 17:39:41.665371 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.665213 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7375079-7914-401b-acf2-42c7b45ca910" containerName="console" Apr 21 17:39:41.665371 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.665278 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7375079-7914-401b-acf2-42c7b45ca910" containerName="console" Apr 21 17:39:41.669595 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.669575 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.672716 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.672696 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 17:39:41.676884 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.676762 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fxbtq"] Apr 21 17:39:41.720620 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.720577 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c-dbus\") pod \"global-pull-secret-syncer-fxbtq\" (UID: \"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c\") " pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.720620 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.720620 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c-original-pull-secret\") pod \"global-pull-secret-syncer-fxbtq\" (UID: \"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c\") " pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.720832 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.720732 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c-kubelet-config\") pod \"global-pull-secret-syncer-fxbtq\" (UID: \"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c\") " pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.821676 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.821636 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c-kubelet-config\") pod \"global-pull-secret-syncer-fxbtq\" (UID: \"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c\") " pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.821839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.821685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c-dbus\") pod \"global-pull-secret-syncer-fxbtq\" (UID: \"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c\") " pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.821839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.821714 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c-original-pull-secret\") pod \"global-pull-secret-syncer-fxbtq\" (UID: \"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c\") " pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.821839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.821757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c-kubelet-config\") pod \"global-pull-secret-syncer-fxbtq\" (UID: \"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c\") " pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.821935 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.821909 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c-dbus\") pod \"global-pull-secret-syncer-fxbtq\" (UID: \"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c\") " pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.824032 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.824012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c-original-pull-secret\") pod \"global-pull-secret-syncer-fxbtq\" (UID: \"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c\") " pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:41.909870 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.909835 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7375079-7914-401b-acf2-42c7b45ca910" path="/var/lib/kubelet/pods/b7375079-7914-401b-acf2-42c7b45ca910/volumes" Apr 21 17:39:41.980039 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:41.979962 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fxbtq" Apr 21 17:39:42.107938 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:42.107801 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fxbtq"] Apr 21 17:39:42.111015 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:39:42.110984 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4b0f8ab_bbb9_4342_8f16_fd9dd2ec095c.slice/crio-5dff9b4e2789f4c78fc4d7bc52e9a173a3da910d7238cbbe8b3a92afd3c23ab9 WatchSource:0}: Error finding container 5dff9b4e2789f4c78fc4d7bc52e9a173a3da910d7238cbbe8b3a92afd3c23ab9: Status 404 returned error can't find the container with id 5dff9b4e2789f4c78fc4d7bc52e9a173a3da910d7238cbbe8b3a92afd3c23ab9 Apr 21 17:39:43.033503 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:43.033464 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fxbtq" event={"ID":"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c","Type":"ContainerStarted","Data":"5dff9b4e2789f4c78fc4d7bc52e9a173a3da910d7238cbbe8b3a92afd3c23ab9"} Apr 21 17:39:47.047269 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:47.047230 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-fxbtq" event={"ID":"e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c","Type":"ContainerStarted","Data":"3189a0c5852564902e7b9955569aa6c37c89c4a4168d2f6eb0c8940c6e0a41aa"} Apr 21 17:39:47.063551 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:39:47.063456 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fxbtq" podStartSLOduration=2.015934792 podStartE2EDuration="6.063438432s" podCreationTimestamp="2026-04-21 17:39:41 +0000 UTC" firstStartedPulling="2026-04-21 17:39:42.112864682 +0000 UTC m=+366.823558767" lastFinishedPulling="2026-04-21 17:39:46.160368317 +0000 UTC m=+370.871062407" observedRunningTime="2026-04-21 17:39:47.062922996 +0000 UTC m=+371.773617104" watchObservedRunningTime="2026-04-21 17:39:47.063438432 +0000 UTC m=+371.774132544" Apr 21 17:40:10.692399 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.692361 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6"] Apr 21 17:40:10.695850 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.695828 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:10.699121 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.699095 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 17:40:10.700491 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.700472 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 17:40:10.700491 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.700485 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fdhhv\"" Apr 21 17:40:10.709292 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.709258 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6"] Apr 21 17:40:10.764783 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.764742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjpgq\" (UniqueName: \"kubernetes.io/projected/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-kube-api-access-zjpgq\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:10.764965 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.764849 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:10.764965 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.764873 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:10.865898 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.865857 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:10.865898 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.865899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:10.866193 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.865957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjpgq\" (UniqueName: \"kubernetes.io/projected/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-kube-api-access-zjpgq\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") " 
pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:10.866378 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.866349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:10.866441 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.866380 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:10.878784 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:10.878744 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjpgq\" (UniqueName: \"kubernetes.io/projected/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-kube-api-access-zjpgq\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:11.005432 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:11.005394 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" Apr 21 17:40:11.141343 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:11.141187 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6"] Apr 21 17:40:11.144034 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:40:11.144002 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f70ef83_e9fd_4b8e_ab25_d3f77437b53a.slice/crio-511c00bb454fde8a86ee15ccd6777832ae804322550650db9d317ca6cf520ee7 WatchSource:0}: Error finding container 511c00bb454fde8a86ee15ccd6777832ae804322550650db9d317ca6cf520ee7: Status 404 returned error can't find the container with id 511c00bb454fde8a86ee15ccd6777832ae804322550650db9d317ca6cf520ee7 Apr 21 17:40:12.119856 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:12.119813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" event={"ID":"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a","Type":"ContainerStarted","Data":"511c00bb454fde8a86ee15ccd6777832ae804322550650db9d317ca6cf520ee7"} Apr 21 17:40:18.139100 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:18.138999 2573 generic.go:358] "Generic (PLEG): container finished" podID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerID="1fd5b6f74f15ee0795c71345641d047cdfbda46a321a3c4f119e69bcb6a56351" exitCode=0 Apr 21 17:40:18.139100 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:18.139043 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" event={"ID":"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a","Type":"ContainerDied","Data":"1fd5b6f74f15ee0795c71345641d047cdfbda46a321a3c4f119e69bcb6a56351"} Apr 21 17:40:20.147924 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:40:20.147880 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" event={"ID":"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a","Type":"ContainerStarted","Data":"539862da84dc2d47d2bb626f262b751f846f73b50ff714bec3ced1f618eaf133"} Apr 21 17:40:21.152403 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:21.152369 2573 generic.go:358] "Generic (PLEG): container finished" podID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerID="539862da84dc2d47d2bb626f262b751f846f73b50ff714bec3ced1f618eaf133" exitCode=0 Apr 21 17:40:21.152822 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:21.152442 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" event={"ID":"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a","Type":"ContainerDied","Data":"539862da84dc2d47d2bb626f262b751f846f73b50ff714bec3ced1f618eaf133"} Apr 21 17:40:29.184727 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:29.184691 2573 generic.go:358] "Generic (PLEG): container finished" podID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerID="fd815089723b93d1a35f3c4eec84583880d6b48dc8125cadeefebc845b4dcb96" exitCode=0 Apr 21 17:40:29.185185 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:29.184776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" event={"ID":"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a","Type":"ContainerDied","Data":"fd815089723b93d1a35f3c4eec84583880d6b48dc8125cadeefebc845b4dcb96"} Apr 21 17:40:30.318930 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.318904 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6"
Apr 21 17:40:30.338433 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.338402 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjpgq\" (UniqueName: \"kubernetes.io/projected/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-kube-api-access-zjpgq\") pod \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") "
Apr 21 17:40:30.340662 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.340628 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-kube-api-access-zjpgq" (OuterVolumeSpecName: "kube-api-access-zjpgq") pod "2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" (UID: "2f70ef83-e9fd-4b8e-ab25-d3f77437b53a"). InnerVolumeSpecName "kube-api-access-zjpgq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 17:40:30.439046 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.438956 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-util\") pod \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") "
Apr 21 17:40:30.439046 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.438996 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-bundle\") pod \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\" (UID: \"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a\") "
Apr 21 17:40:30.439316 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.439157 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zjpgq\" (UniqueName: \"kubernetes.io/projected/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-kube-api-access-zjpgq\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:40:30.439693 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.439666 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-bundle" (OuterVolumeSpecName: "bundle") pod "2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" (UID: "2f70ef83-e9fd-4b8e-ab25-d3f77437b53a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 17:40:30.442890 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.442858 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-util" (OuterVolumeSpecName: "util") pod "2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" (UID: "2f70ef83-e9fd-4b8e-ab25-d3f77437b53a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 17:40:30.539686 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.539650 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:40:30.539686 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:30.539682 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f70ef83-e9fd-4b8e-ab25-d3f77437b53a-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:40:31.192070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:31.192034 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6" event={"ID":"2f70ef83-e9fd-4b8e-ab25-d3f77437b53a","Type":"ContainerDied","Data":"511c00bb454fde8a86ee15ccd6777832ae804322550650db9d317ca6cf520ee7"}
Apr 21 17:40:31.192070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:31.192073 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="511c00bb454fde8a86ee15ccd6777832ae804322550650db9d317ca6cf520ee7"
Apr 21 17:40:31.192070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:31.192055 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dqr6p6"
Apr 21 17:40:38.738322 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.738283 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"]
Apr 21 17:40:38.738712 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.738593 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerName="pull"
Apr 21 17:40:38.738712 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.738604 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerName="pull"
Apr 21 17:40:38.738712 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.738629 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerName="util"
Apr 21 17:40:38.738712 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.738635 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerName="util"
Apr 21 17:40:38.738712 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.738640 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerName="extract"
Apr 21 17:40:38.738712 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.738646 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerName="extract"
Apr 21 17:40:38.738712 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.738692 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f70ef83-e9fd-4b8e-ab25-d3f77437b53a" containerName="extract"
Apr 21 17:40:38.743177 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.743125 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"
Apr 21 17:40:38.746644 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.746615 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-7wgt9\""
Apr 21 17:40:38.747116 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.747095 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 21 17:40:38.747601 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.747582 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 21 17:40:38.760830 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.760795 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"]
Apr 21 17:40:38.800880 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.800839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hvwj\" (UniqueName: \"kubernetes.io/projected/4f1ad77c-4ea8-47d7-814b-db87f4e42ce0-kube-api-access-8hvwj\") pod \"cert-manager-operator-controller-manager-54b9655956-8c97g\" (UID: \"4f1ad77c-4ea8-47d7-814b-db87f4e42ce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"
Apr 21 17:40:38.800880 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.800888 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f1ad77c-4ea8-47d7-814b-db87f4e42ce0-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-8c97g\" (UID: \"4f1ad77c-4ea8-47d7-814b-db87f4e42ce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"
Apr 21 17:40:38.901495 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.901451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hvwj\" (UniqueName: \"kubernetes.io/projected/4f1ad77c-4ea8-47d7-814b-db87f4e42ce0-kube-api-access-8hvwj\") pod \"cert-manager-operator-controller-manager-54b9655956-8c97g\" (UID: \"4f1ad77c-4ea8-47d7-814b-db87f4e42ce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"
Apr 21 17:40:38.901495 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.901499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f1ad77c-4ea8-47d7-814b-db87f4e42ce0-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-8c97g\" (UID: \"4f1ad77c-4ea8-47d7-814b-db87f4e42ce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"
Apr 21 17:40:38.901852 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.901835 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4f1ad77c-4ea8-47d7-814b-db87f4e42ce0-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-8c97g\" (UID: \"4f1ad77c-4ea8-47d7-814b-db87f4e42ce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"
Apr 21 17:40:38.915475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:38.915446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hvwj\" (UniqueName: \"kubernetes.io/projected/4f1ad77c-4ea8-47d7-814b-db87f4e42ce0-kube-api-access-8hvwj\") pod \"cert-manager-operator-controller-manager-54b9655956-8c97g\" (UID: \"4f1ad77c-4ea8-47d7-814b-db87f4e42ce0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"
Apr 21 17:40:39.052587 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:39.052486 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"
Apr 21 17:40:39.194942 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:39.194756 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g"]
Apr 21 17:40:39.197900 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:40:39.197866 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1ad77c_4ea8_47d7_814b_db87f4e42ce0.slice/crio-5a37c32a6b7d2cc6e531c3ab82122e3c9a9555a89493f72a156f24dac75de13d WatchSource:0}: Error finding container 5a37c32a6b7d2cc6e531c3ab82122e3c9a9555a89493f72a156f24dac75de13d: Status 404 returned error can't find the container with id 5a37c32a6b7d2cc6e531c3ab82122e3c9a9555a89493f72a156f24dac75de13d
Apr 21 17:40:39.217197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:39.217157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g" event={"ID":"4f1ad77c-4ea8-47d7-814b-db87f4e42ce0","Type":"ContainerStarted","Data":"5a37c32a6b7d2cc6e531c3ab82122e3c9a9555a89493f72a156f24dac75de13d"}
Apr 21 17:40:41.225827 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:41.225716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g" event={"ID":"4f1ad77c-4ea8-47d7-814b-db87f4e42ce0","Type":"ContainerStarted","Data":"4d6320fa913ed81c6b7a51279f3e80e6e36b6d8d505eecb6f298bddd36e80ed2"}
Apr 21 17:40:41.267543 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:41.267480 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-8c97g" podStartSLOduration=1.6641808519999999 podStartE2EDuration="3.267456626s" podCreationTimestamp="2026-04-21 17:40:38 +0000 UTC" firstStartedPulling="2026-04-21 17:40:39.200627149 +0000 UTC m=+423.911321240" lastFinishedPulling="2026-04-21 17:40:40.803902925 +0000 UTC m=+425.514597014" observedRunningTime="2026-04-21 17:40:41.264815525 +0000 UTC m=+425.975509630" watchObservedRunningTime="2026-04-21 17:40:41.267456626 +0000 UTC m=+425.978150735"
Apr 21 17:40:42.941660 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:42.941623 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"]
Apr 21 17:40:42.945262 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:42.945243 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:42.948889 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:42.948863 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 21 17:40:42.949078 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:42.949020 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fdhhv\""
Apr 21 17:40:42.950234 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:42.950215 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 21 17:40:42.955572 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:42.955544 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"]
Apr 21 17:40:43.036707 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.036669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.036707 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.036711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q89hf\" (UniqueName: \"kubernetes.io/projected/05701106-6dac-4257-9892-0513b2761bd9-kube-api-access-q89hf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.036941 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.036807 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.137964 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.137915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.138166 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.137979 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q89hf\" (UniqueName: \"kubernetes.io/projected/05701106-6dac-4257-9892-0513b2761bd9-kube-api-access-q89hf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.138166 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.138033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.138345 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.138325 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.138406 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.138365 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.150961 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.150929 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q89hf\" (UniqueName: \"kubernetes.io/projected/05701106-6dac-4257-9892-0513b2761bd9-kube-api-access-q89hf\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.256259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.256228 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:43.381723 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:43.381687 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"]
Apr 21 17:40:43.384608 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:40:43.384571 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05701106_6dac_4257_9892_0513b2761bd9.slice/crio-a4541dc48bfa3225cb06987aa7c78702c3801897b5476b70d8cff8bda0d020b3 WatchSource:0}: Error finding container a4541dc48bfa3225cb06987aa7c78702c3801897b5476b70d8cff8bda0d020b3: Status 404 returned error can't find the container with id a4541dc48bfa3225cb06987aa7c78702c3801897b5476b70d8cff8bda0d020b3
Apr 21 17:40:44.094121 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.094084 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-7xdb5"]
Apr 21 17:40:44.097350 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.097328 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5"
Apr 21 17:40:44.102369 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.102339 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 21 17:40:44.103579 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.103527 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 21 17:40:44.103579 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.103527 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-6b85n\""
Apr 21 17:40:44.112934 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.112905 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-7xdb5"]
Apr 21 17:40:44.143895 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.143860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/239f1220-f25d-4a1a-ae84-6c527a9dbe6b-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-7xdb5\" (UID: \"239f1220-f25d-4a1a-ae84-6c527a9dbe6b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5"
Apr 21 17:40:44.144080 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.143914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7gg2\" (UniqueName: \"kubernetes.io/projected/239f1220-f25d-4a1a-ae84-6c527a9dbe6b-kube-api-access-c7gg2\") pod \"cert-manager-webhook-587ccfb98-7xdb5\" (UID: \"239f1220-f25d-4a1a-ae84-6c527a9dbe6b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5"
Apr 21 17:40:44.236814 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.236777 2573 generic.go:358] "Generic (PLEG): container finished" podID="05701106-6dac-4257-9892-0513b2761bd9" containerID="7d5f5d9a3ae5cf366c01650b29a4c13dc10c1d5904e77d974bcaf27ba87e4d7a" exitCode=0
Apr 21 17:40:44.236958 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.236863 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g" event={"ID":"05701106-6dac-4257-9892-0513b2761bd9","Type":"ContainerDied","Data":"7d5f5d9a3ae5cf366c01650b29a4c13dc10c1d5904e77d974bcaf27ba87e4d7a"}
Apr 21 17:40:44.236958 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.236903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g" event={"ID":"05701106-6dac-4257-9892-0513b2761bd9","Type":"ContainerStarted","Data":"a4541dc48bfa3225cb06987aa7c78702c3801897b5476b70d8cff8bda0d020b3"}
Apr 21 17:40:44.244566 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.244535 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/239f1220-f25d-4a1a-ae84-6c527a9dbe6b-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-7xdb5\" (UID: \"239f1220-f25d-4a1a-ae84-6c527a9dbe6b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5"
Apr 21 17:40:44.244669 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.244583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7gg2\" (UniqueName: \"kubernetes.io/projected/239f1220-f25d-4a1a-ae84-6c527a9dbe6b-kube-api-access-c7gg2\") pod \"cert-manager-webhook-587ccfb98-7xdb5\" (UID: \"239f1220-f25d-4a1a-ae84-6c527a9dbe6b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5"
Apr 21 17:40:44.260445 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.260414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/239f1220-f25d-4a1a-ae84-6c527a9dbe6b-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-7xdb5\" (UID: \"239f1220-f25d-4a1a-ae84-6c527a9dbe6b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5"
Apr 21 17:40:44.262209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.262186 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7gg2\" (UniqueName: \"kubernetes.io/projected/239f1220-f25d-4a1a-ae84-6c527a9dbe6b-kube-api-access-c7gg2\") pod \"cert-manager-webhook-587ccfb98-7xdb5\" (UID: \"239f1220-f25d-4a1a-ae84-6c527a9dbe6b\") " pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5"
Apr 21 17:40:44.422833 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.422738 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5"
Apr 21 17:40:44.550082 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:44.550048 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-7xdb5"]
Apr 21 17:40:44.552880 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:40:44.552836 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod239f1220_f25d_4a1a_ae84_6c527a9dbe6b.slice/crio-b05d38e79c8cdb6cfb3158e0d0927112fd60a756952f78b8b36fc9eb2bd6ccb0 WatchSource:0}: Error finding container b05d38e79c8cdb6cfb3158e0d0927112fd60a756952f78b8b36fc9eb2bd6ccb0: Status 404 returned error can't find the container with id b05d38e79c8cdb6cfb3158e0d0927112fd60a756952f78b8b36fc9eb2bd6ccb0
Apr 21 17:40:45.241919 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:45.241880 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5" event={"ID":"239f1220-f25d-4a1a-ae84-6c527a9dbe6b","Type":"ContainerStarted","Data":"b05d38e79c8cdb6cfb3158e0d0927112fd60a756952f78b8b36fc9eb2bd6ccb0"}
Apr 21 17:40:46.896469 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:46.896327 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8kjf9"]
Apr 21 17:40:46.899818 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:46.899791 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9"
Apr 21 17:40:46.902411 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:46.902378 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-c9j5q\""
Apr 21 17:40:46.910326 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:46.910161 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8kjf9"]
Apr 21 17:40:46.968747 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:46.968709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b67402bb-f627-48fc-8a8b-cea27e2db8cb-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8kjf9\" (UID: \"b67402bb-f627-48fc-8a8b-cea27e2db8cb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9"
Apr 21 17:40:46.968931 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:46.968805 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c96wb\" (UniqueName: \"kubernetes.io/projected/b67402bb-f627-48fc-8a8b-cea27e2db8cb-kube-api-access-c96wb\") pod \"cert-manager-cainjector-68b757865b-8kjf9\" (UID: \"b67402bb-f627-48fc-8a8b-cea27e2db8cb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9"
Apr 21 17:40:47.069678 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:47.069637 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b67402bb-f627-48fc-8a8b-cea27e2db8cb-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8kjf9\" (UID: \"b67402bb-f627-48fc-8a8b-cea27e2db8cb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9"
Apr 21 17:40:47.069871 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:47.069728 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c96wb\" (UniqueName: \"kubernetes.io/projected/b67402bb-f627-48fc-8a8b-cea27e2db8cb-kube-api-access-c96wb\") pod \"cert-manager-cainjector-68b757865b-8kjf9\" (UID: \"b67402bb-f627-48fc-8a8b-cea27e2db8cb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9"
Apr 21 17:40:47.080782 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:47.080751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c96wb\" (UniqueName: \"kubernetes.io/projected/b67402bb-f627-48fc-8a8b-cea27e2db8cb-kube-api-access-c96wb\") pod \"cert-manager-cainjector-68b757865b-8kjf9\" (UID: \"b67402bb-f627-48fc-8a8b-cea27e2db8cb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9"
Apr 21 17:40:47.080998 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:47.080972 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b67402bb-f627-48fc-8a8b-cea27e2db8cb-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-8kjf9\" (UID: \"b67402bb-f627-48fc-8a8b-cea27e2db8cb\") " pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9"
Apr 21 17:40:47.214226 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:47.214141 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9"
Apr 21 17:40:47.251236 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:47.251192 2573 generic.go:358] "Generic (PLEG): container finished" podID="05701106-6dac-4257-9892-0513b2761bd9" containerID="697412b5a89b962875876ad5a03d9c7249de7884243b1c4a233b2ada6a85cdee" exitCode=0
Apr 21 17:40:47.251428 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:47.251260 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g" event={"ID":"05701106-6dac-4257-9892-0513b2761bd9","Type":"ContainerDied","Data":"697412b5a89b962875876ad5a03d9c7249de7884243b1c4a233b2ada6a85cdee"}
Apr 21 17:40:47.363011 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:47.362980 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-8kjf9"]
Apr 21 17:40:47.366004 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:40:47.365968 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb67402bb_f627_48fc_8a8b_cea27e2db8cb.slice/crio-91dfb43235360429d84f8e235ccaee9711119b634083a0f887df6148f5a26236 WatchSource:0}: Error finding container 91dfb43235360429d84f8e235ccaee9711119b634083a0f887df6148f5a26236: Status 404 returned error can't find the container with id 91dfb43235360429d84f8e235ccaee9711119b634083a0f887df6148f5a26236
Apr 21 17:40:48.259634 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:48.259599 2573 generic.go:358] "Generic (PLEG): container finished" podID="05701106-6dac-4257-9892-0513b2761bd9" containerID="447f470b9163610ad27f02ca56288a4783e24bdb2d69f95486d0eb0f33f63730" exitCode=0
Apr 21 17:40:48.260111 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:48.259690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g" event={"ID":"05701106-6dac-4257-9892-0513b2761bd9","Type":"ContainerDied","Data":"447f470b9163610ad27f02ca56288a4783e24bdb2d69f95486d0eb0f33f63730"}
Apr 21 17:40:48.260992 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:48.260965 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9" event={"ID":"b67402bb-f627-48fc-8a8b-cea27e2db8cb","Type":"ContainerStarted","Data":"91dfb43235360429d84f8e235ccaee9711119b634083a0f887df6148f5a26236"}
Apr 21 17:40:49.265656 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.265621 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9" event={"ID":"b67402bb-f627-48fc-8a8b-cea27e2db8cb","Type":"ContainerStarted","Data":"6a199db9da84d3a05c62e838f28efc5c97955a5f471a611b7825b6f979454b67"}
Apr 21 17:40:49.267081 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.267056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5" event={"ID":"239f1220-f25d-4a1a-ae84-6c527a9dbe6b","Type":"ContainerStarted","Data":"bf1cfffeaf9cb7e682004f4299c28ad481f9c1ae42bdda3332530f0b5042f0cc"}
Apr 21 17:40:49.267344 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.267318 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5"
Apr 21 17:40:49.288177 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.288099 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-8kjf9" podStartSLOduration=2.247003228 podStartE2EDuration="3.288083268s" podCreationTimestamp="2026-04-21 17:40:46 +0000 UTC" firstStartedPulling="2026-04-21 17:40:47.368828172 +0000 UTC m=+432.079522257" lastFinishedPulling="2026-04-21 17:40:48.409908196 +0000 UTC m=+433.120602297" observedRunningTime="2026-04-21 17:40:49.286765751 +0000 UTC m=+433.997459861" watchObservedRunningTime="2026-04-21 17:40:49.288083268 +0000 UTC m=+433.998777375"
Apr 21 17:40:49.305329 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.305260 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5" podStartSLOduration=1.450881008 podStartE2EDuration="5.305238424s" podCreationTimestamp="2026-04-21 17:40:44 +0000 UTC" firstStartedPulling="2026-04-21 17:40:44.554806769 +0000 UTC m=+429.265500855" lastFinishedPulling="2026-04-21 17:40:48.409164179 +0000 UTC m=+433.119858271" observedRunningTime="2026-04-21 17:40:49.304493069 +0000 UTC m=+434.015187190" watchObservedRunningTime="2026-04-21 17:40:49.305238424 +0000 UTC m=+434.015932533"
Apr 21 17:40:49.405095 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.405071 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g"
Apr 21 17:40:49.489897 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.489859 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q89hf\" (UniqueName: \"kubernetes.io/projected/05701106-6dac-4257-9892-0513b2761bd9-kube-api-access-q89hf\") pod \"05701106-6dac-4257-9892-0513b2761bd9\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") "
Apr 21 17:40:49.490099 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.489948 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-util\") pod \"05701106-6dac-4257-9892-0513b2761bd9\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") "
Apr 21 17:40:49.490099 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.489990 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-bundle\") pod \"05701106-6dac-4257-9892-0513b2761bd9\" (UID: \"05701106-6dac-4257-9892-0513b2761bd9\") "
Apr 21 17:40:49.490450 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.490421 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-bundle" (OuterVolumeSpecName: "bundle") pod "05701106-6dac-4257-9892-0513b2761bd9" (UID: "05701106-6dac-4257-9892-0513b2761bd9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 17:40:49.492078 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.492054 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05701106-6dac-4257-9892-0513b2761bd9-kube-api-access-q89hf" (OuterVolumeSpecName: "kube-api-access-q89hf") pod "05701106-6dac-4257-9892-0513b2761bd9" (UID: "05701106-6dac-4257-9892-0513b2761bd9"). InnerVolumeSpecName "kube-api-access-q89hf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 17:40:49.530575 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.530485 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-util" (OuterVolumeSpecName: "util") pod "05701106-6dac-4257-9892-0513b2761bd9" (UID: "05701106-6dac-4257-9892-0513b2761bd9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 17:40:49.591436 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.591399 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:40:49.591436 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.591431 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q89hf\" (UniqueName: \"kubernetes.io/projected/05701106-6dac-4257-9892-0513b2761bd9-kube-api-access-q89hf\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:40:49.591436 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:49.591441 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05701106-6dac-4257-9892-0513b2761bd9-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:40:50.271916 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:50.271881 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g" Apr 21 17:40:50.271916 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:50.271878 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fgh69g" event={"ID":"05701106-6dac-4257-9892-0513b2761bd9","Type":"ContainerDied","Data":"a4541dc48bfa3225cb06987aa7c78702c3801897b5476b70d8cff8bda0d020b3"} Apr 21 17:40:50.272422 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:50.271934 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4541dc48bfa3225cb06987aa7c78702c3801897b5476b70d8cff8bda0d020b3" Apr 21 17:40:55.274663 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:40:55.274632 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-7xdb5" Apr 21 17:41:02.044685 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.044650 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-fdjx4"] Apr 21 17:41:02.045197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.044965 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05701106-6dac-4257-9892-0513b2761bd9" containerName="pull" Apr 21 17:41:02.045197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.044976 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="05701106-6dac-4257-9892-0513b2761bd9" containerName="pull" Apr 21 17:41:02.045197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.044988 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05701106-6dac-4257-9892-0513b2761bd9" containerName="extract" Apr 21 17:41:02.045197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.044994 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="05701106-6dac-4257-9892-0513b2761bd9" containerName="extract" Apr 21 
17:41:02.045197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.045008 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05701106-6dac-4257-9892-0513b2761bd9" containerName="util" Apr 21 17:41:02.045197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.045013 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="05701106-6dac-4257-9892-0513b2761bd9" containerName="util" Apr 21 17:41:02.045197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.045072 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="05701106-6dac-4257-9892-0513b2761bd9" containerName="extract" Apr 21 17:41:02.050632 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.050614 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-fdjx4" Apr 21 17:41:02.055428 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.055403 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-wvb9n\"" Apr 21 17:41:02.059361 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.059324 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-fdjx4"] Apr 21 17:41:02.099701 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.099660 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21a7c4a2-1b3a-45a0-908b-a7666bb02ccc-bound-sa-token\") pod \"cert-manager-79c8d999ff-fdjx4\" (UID: \"21a7c4a2-1b3a-45a0-908b-a7666bb02ccc\") " pod="cert-manager/cert-manager-79c8d999ff-fdjx4" Apr 21 17:41:02.099701 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.099704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trgmp\" (UniqueName: \"kubernetes.io/projected/21a7c4a2-1b3a-45a0-908b-a7666bb02ccc-kube-api-access-trgmp\") pod 
\"cert-manager-79c8d999ff-fdjx4\" (UID: \"21a7c4a2-1b3a-45a0-908b-a7666bb02ccc\") " pod="cert-manager/cert-manager-79c8d999ff-fdjx4" Apr 21 17:41:02.201123 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.201083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21a7c4a2-1b3a-45a0-908b-a7666bb02ccc-bound-sa-token\") pod \"cert-manager-79c8d999ff-fdjx4\" (UID: \"21a7c4a2-1b3a-45a0-908b-a7666bb02ccc\") " pod="cert-manager/cert-manager-79c8d999ff-fdjx4" Apr 21 17:41:02.201123 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.201128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-trgmp\" (UniqueName: \"kubernetes.io/projected/21a7c4a2-1b3a-45a0-908b-a7666bb02ccc-kube-api-access-trgmp\") pod \"cert-manager-79c8d999ff-fdjx4\" (UID: \"21a7c4a2-1b3a-45a0-908b-a7666bb02ccc\") " pod="cert-manager/cert-manager-79c8d999ff-fdjx4" Apr 21 17:41:02.213153 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.209839 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21a7c4a2-1b3a-45a0-908b-a7666bb02ccc-bound-sa-token\") pod \"cert-manager-79c8d999ff-fdjx4\" (UID: \"21a7c4a2-1b3a-45a0-908b-a7666bb02ccc\") " pod="cert-manager/cert-manager-79c8d999ff-fdjx4" Apr 21 17:41:02.216479 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.216446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-trgmp\" (UniqueName: \"kubernetes.io/projected/21a7c4a2-1b3a-45a0-908b-a7666bb02ccc-kube-api-access-trgmp\") pod \"cert-manager-79c8d999ff-fdjx4\" (UID: \"21a7c4a2-1b3a-45a0-908b-a7666bb02ccc\") " pod="cert-manager/cert-manager-79c8d999ff-fdjx4" Apr 21 17:41:02.360338 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.360242 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-fdjx4" Apr 21 17:41:02.497657 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:02.497626 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-fdjx4"] Apr 21 17:41:02.501322 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:41:02.501286 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a7c4a2_1b3a_45a0_908b_a7666bb02ccc.slice/crio-38092ed19bbf51d444c8316e2a7aa57c2a0c638508249a1e0bbb65bd70042629 WatchSource:0}: Error finding container 38092ed19bbf51d444c8316e2a7aa57c2a0c638508249a1e0bbb65bd70042629: Status 404 returned error can't find the container with id 38092ed19bbf51d444c8316e2a7aa57c2a0c638508249a1e0bbb65bd70042629 Apr 21 17:41:03.317361 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:03.317319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-fdjx4" event={"ID":"21a7c4a2-1b3a-45a0-908b-a7666bb02ccc","Type":"ContainerStarted","Data":"b0f065cb0e2fb82051369e761578e91c8945e0c444a870b9c74d5154e0033e70"} Apr 21 17:41:03.317361 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:03.317360 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-fdjx4" event={"ID":"21a7c4a2-1b3a-45a0-908b-a7666bb02ccc","Type":"ContainerStarted","Data":"38092ed19bbf51d444c8316e2a7aa57c2a0c638508249a1e0bbb65bd70042629"} Apr 21 17:41:03.338310 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:03.338257 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-fdjx4" podStartSLOduration=1.33824242 podStartE2EDuration="1.33824242s" podCreationTimestamp="2026-04-21 17:41:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:41:03.336624353 +0000 UTC m=+448.047318461" 
watchObservedRunningTime="2026-04-21 17:41:03.33824242 +0000 UTC m=+448.048936528" Apr 21 17:41:04.371619 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.371576 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w"] Apr 21 17:41:04.375588 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.375560 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.378336 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.378308 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 17:41:04.379484 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.379463 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 17:41:04.379625 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.379480 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fdhhv\"" Apr 21 17:41:04.386982 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.383919 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w"] Apr 21 17:41:04.522119 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.522078 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.522339 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:41:04.522188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-586n5\" (UniqueName: \"kubernetes.io/projected/1ec86c25-3109-427a-8113-b74251d0ab26-kube-api-access-586n5\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.522339 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.522226 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.623586 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.623492 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.623586 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.623555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-586n5\" (UniqueName: \"kubernetes.io/projected/1ec86c25-3109-427a-8113-b74251d0ab26-kube-api-access-586n5\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.623586 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:41:04.623581 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.623908 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.623887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.623971 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.623912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.631985 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.631952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-586n5\" (UniqueName: \"kubernetes.io/projected/1ec86c25-3109-427a-8113-b74251d0ab26-kube-api-access-586n5\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.687626 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.687585 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:04.823287 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:04.823261 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w"] Apr 21 17:41:04.825857 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:41:04.825827 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ec86c25_3109_427a_8113_b74251d0ab26.slice/crio-415edbb465b84c7a128c71048586438cd8feacdcadd1d54ee9313ebfe5333ed1 WatchSource:0}: Error finding container 415edbb465b84c7a128c71048586438cd8feacdcadd1d54ee9313ebfe5333ed1: Status 404 returned error can't find the container with id 415edbb465b84c7a128c71048586438cd8feacdcadd1d54ee9313ebfe5333ed1 Apr 21 17:41:05.325645 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:05.325609 2573 generic.go:358] "Generic (PLEG): container finished" podID="1ec86c25-3109-427a-8113-b74251d0ab26" containerID="385b9647f912e8349c28dc2f838f10dccc9f6ded9f5611c8d3a7e64fd787ad57" exitCode=0 Apr 21 17:41:05.325829 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:05.325692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" event={"ID":"1ec86c25-3109-427a-8113-b74251d0ab26","Type":"ContainerDied","Data":"385b9647f912e8349c28dc2f838f10dccc9f6ded9f5611c8d3a7e64fd787ad57"} Apr 21 17:41:05.325829 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:05.325725 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" event={"ID":"1ec86c25-3109-427a-8113-b74251d0ab26","Type":"ContainerStarted","Data":"415edbb465b84c7a128c71048586438cd8feacdcadd1d54ee9313ebfe5333ed1"} Apr 21 17:41:06.331075 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:41:06.331031 2573 generic.go:358] "Generic (PLEG): container finished" podID="1ec86c25-3109-427a-8113-b74251d0ab26" containerID="21007f03335fc7d208bd709b82994da355b3644b8ccab081601bd65a6f8d2e00" exitCode=0 Apr 21 17:41:06.331507 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:06.331067 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" event={"ID":"1ec86c25-3109-427a-8113-b74251d0ab26","Type":"ContainerDied","Data":"21007f03335fc7d208bd709b82994da355b3644b8ccab081601bd65a6f8d2e00"} Apr 21 17:41:07.337116 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:07.337078 2573 generic.go:358] "Generic (PLEG): container finished" podID="1ec86c25-3109-427a-8113-b74251d0ab26" containerID="aea8b24cbbc94ecaf5c944648b2b75ba1cdc8097c749a67362e487241c0e1fc4" exitCode=0 Apr 21 17:41:07.337522 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:07.337176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" event={"ID":"1ec86c25-3109-427a-8113-b74251d0ab26","Type":"ContainerDied","Data":"aea8b24cbbc94ecaf5c944648b2b75ba1cdc8097c749a67362e487241c0e1fc4"} Apr 21 17:41:08.467625 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.467599 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:08.558778 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.558741 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-586n5\" (UniqueName: \"kubernetes.io/projected/1ec86c25-3109-427a-8113-b74251d0ab26-kube-api-access-586n5\") pod \"1ec86c25-3109-427a-8113-b74251d0ab26\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " Apr 21 17:41:08.558979 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.558804 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-util\") pod \"1ec86c25-3109-427a-8113-b74251d0ab26\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " Apr 21 17:41:08.558979 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.558858 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-bundle\") pod \"1ec86c25-3109-427a-8113-b74251d0ab26\" (UID: \"1ec86c25-3109-427a-8113-b74251d0ab26\") " Apr 21 17:41:08.559582 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.559555 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-bundle" (OuterVolumeSpecName: "bundle") pod "1ec86c25-3109-427a-8113-b74251d0ab26" (UID: "1ec86c25-3109-427a-8113-b74251d0ab26"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:41:08.560994 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.560967 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec86c25-3109-427a-8113-b74251d0ab26-kube-api-access-586n5" (OuterVolumeSpecName: "kube-api-access-586n5") pod "1ec86c25-3109-427a-8113-b74251d0ab26" (UID: "1ec86c25-3109-427a-8113-b74251d0ab26"). InnerVolumeSpecName "kube-api-access-586n5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:41:08.564112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.564068 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-util" (OuterVolumeSpecName: "util") pod "1ec86c25-3109-427a-8113-b74251d0ab26" (UID: "1ec86c25-3109-427a-8113-b74251d0ab26"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:41:08.659634 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.659541 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:08.659634 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.659577 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-586n5\" (UniqueName: \"kubernetes.io/projected/1ec86c25-3109-427a-8113-b74251d0ab26-kube-api-access-586n5\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:08.659634 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:08.659587 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1ec86c25-3109-427a-8113-b74251d0ab26-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:09.346386 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:09.346349 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" event={"ID":"1ec86c25-3109-427a-8113-b74251d0ab26","Type":"ContainerDied","Data":"415edbb465b84c7a128c71048586438cd8feacdcadd1d54ee9313ebfe5333ed1"} Apr 21 17:41:09.346386 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:09.346384 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="415edbb465b84c7a128c71048586438cd8feacdcadd1d54ee9313ebfe5333ed1" Apr 21 17:41:09.346598 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:09.346392 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5q4g8w" Apr 21 17:41:18.366397 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.366357 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz"] Apr 21 17:41:18.366756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.366675 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ec86c25-3109-427a-8113-b74251d0ab26" containerName="extract" Apr 21 17:41:18.366756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.366685 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec86c25-3109-427a-8113-b74251d0ab26" containerName="extract" Apr 21 17:41:18.366756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.366693 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ec86c25-3109-427a-8113-b74251d0ab26" containerName="pull" Apr 21 17:41:18.366756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.366699 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec86c25-3109-427a-8113-b74251d0ab26" containerName="pull" Apr 21 17:41:18.366756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.366706 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1ec86c25-3109-427a-8113-b74251d0ab26" containerName="util" Apr 21 17:41:18.366756 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.366712 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec86c25-3109-427a-8113-b74251d0ab26" containerName="util" Apr 21 17:41:18.366953 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.366773 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ec86c25-3109-427a-8113-b74251d0ab26" containerName="extract" Apr 21 17:41:18.374727 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.374693 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.380551 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.380517 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 17:41:18.381725 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.381701 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fdhhv\"" Apr 21 17:41:18.381811 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.381730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 17:41:18.388494 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.388458 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz"] Apr 21 17:41:18.542579 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.542539 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.542762 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.542636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzhxz\" (UniqueName: \"kubernetes.io/projected/680abfad-a760-4597-8554-0f027c4bf54a-kube-api-access-xzhxz\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.542762 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.542695 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.648570 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.643989 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.648570 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.644102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzhxz\" (UniqueName: \"kubernetes.io/projected/680abfad-a760-4597-8554-0f027c4bf54a-kube-api-access-xzhxz\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.648570 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.644196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.648570 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.644665 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.648570 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.644906 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.654088 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.654064 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzhxz\" (UniqueName: \"kubernetes.io/projected/680abfad-a760-4597-8554-0f027c4bf54a-kube-api-access-xzhxz\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 
17:41:18.684287 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.684249 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:18.820453 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:18.820318 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz"] Apr 21 17:41:18.823302 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:41:18.823265 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680abfad_a760_4597_8554_0f027c4bf54a.slice/crio-4339a107da3afc611867bfb0dd9e4010dab4c527c7b136227db3afa3c6319000 WatchSource:0}: Error finding container 4339a107da3afc611867bfb0dd9e4010dab4c527c7b136227db3afa3c6319000: Status 404 returned error can't find the container with id 4339a107da3afc611867bfb0dd9e4010dab4c527c7b136227db3afa3c6319000 Apr 21 17:41:19.126744 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.126703 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb"] Apr 21 17:41:19.130665 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.130636 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.134111 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.134083 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 17:41:19.134305 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.134289 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 17:41:19.140459 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.140435 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 17:41:19.140613 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.140500 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-gf9sq\"" Apr 21 17:41:19.140730 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.140715 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 17:41:19.161451 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.161419 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb"] Apr 21 17:41:19.249638 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.249609 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clgzp\" (UniqueName: \"kubernetes.io/projected/e8a8ce54-79d0-4e08-9b00-73835796a047-kube-api-access-clgzp\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb\" (UID: \"e8a8ce54-79d0-4e08-9b00-73835796a047\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.249806 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.249656 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e8a8ce54-79d0-4e08-9b00-73835796a047-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb\" (UID: \"e8a8ce54-79d0-4e08-9b00-73835796a047\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.249806 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.249685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e8a8ce54-79d0-4e08-9b00-73835796a047-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb\" (UID: \"e8a8ce54-79d0-4e08-9b00-73835796a047\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.351015 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.350974 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e8a8ce54-79d0-4e08-9b00-73835796a047-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb\" (UID: \"e8a8ce54-79d0-4e08-9b00-73835796a047\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.351015 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.351020 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e8a8ce54-79d0-4e08-9b00-73835796a047-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb\" (UID: \"e8a8ce54-79d0-4e08-9b00-73835796a047\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.351301 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.351197 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clgzp\" (UniqueName: 
\"kubernetes.io/projected/e8a8ce54-79d0-4e08-9b00-73835796a047-kube-api-access-clgzp\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb\" (UID: \"e8a8ce54-79d0-4e08-9b00-73835796a047\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.353672 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.353644 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e8a8ce54-79d0-4e08-9b00-73835796a047-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb\" (UID: \"e8a8ce54-79d0-4e08-9b00-73835796a047\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.353672 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.353661 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e8a8ce54-79d0-4e08-9b00-73835796a047-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb\" (UID: \"e8a8ce54-79d0-4e08-9b00-73835796a047\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.365615 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.365582 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clgzp\" (UniqueName: \"kubernetes.io/projected/e8a8ce54-79d0-4e08-9b00-73835796a047-kube-api-access-clgzp\") pod \"opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb\" (UID: \"e8a8ce54-79d0-4e08-9b00-73835796a047\") " pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.381986 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.381948 2573 generic.go:358] "Generic (PLEG): container finished" podID="680abfad-a760-4597-8554-0f027c4bf54a" containerID="4139eb3dd3d86a7720cf8715f9b1d8c40b80572efa1efccb13993e16274b73ef" exitCode=0 Apr 21 17:41:19.381986 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:41:19.381989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" event={"ID":"680abfad-a760-4597-8554-0f027c4bf54a","Type":"ContainerDied","Data":"4139eb3dd3d86a7720cf8715f9b1d8c40b80572efa1efccb13993e16274b73ef"} Apr 21 17:41:19.382416 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.382011 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" event={"ID":"680abfad-a760-4597-8554-0f027c4bf54a","Type":"ContainerStarted","Data":"4339a107da3afc611867bfb0dd9e4010dab4c527c7b136227db3afa3c6319000"} Apr 21 17:41:19.440831 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.440795 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:19.584460 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:19.584415 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb"] Apr 21 17:41:19.588913 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:41:19.588870 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8a8ce54_79d0_4e08_9b00_73835796a047.slice/crio-5ea6799b876b42c8b87493d2c3b33527715a84996165fa22c05dbba068e077b2 WatchSource:0}: Error finding container 5ea6799b876b42c8b87493d2c3b33527715a84996165fa22c05dbba068e077b2: Status 404 returned error can't find the container with id 5ea6799b876b42c8b87493d2c3b33527715a84996165fa22c05dbba068e077b2 Apr 21 17:41:20.389052 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:20.388942 2573 generic.go:358] "Generic (PLEG): container finished" podID="680abfad-a760-4597-8554-0f027c4bf54a" containerID="5684a040c9759164feeb3a6965c242f360ca82e773e775bed3d9c38462c44a96" exitCode=0 Apr 21 17:41:20.389052 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:20.389028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" event={"ID":"680abfad-a760-4597-8554-0f027c4bf54a","Type":"ContainerDied","Data":"5684a040c9759164feeb3a6965c242f360ca82e773e775bed3d9c38462c44a96"} Apr 21 17:41:20.390621 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:20.390573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" event={"ID":"e8a8ce54-79d0-4e08-9b00-73835796a047","Type":"ContainerStarted","Data":"5ea6799b876b42c8b87493d2c3b33527715a84996165fa22c05dbba068e077b2"} Apr 21 17:41:21.397927 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:21.397874 2573 generic.go:358] "Generic (PLEG): container finished" podID="680abfad-a760-4597-8554-0f027c4bf54a" containerID="7d8ffc1a4717083d66634ce4c82d91345a2c1d0341efd922c33c8f1b19a4ff93" exitCode=0 Apr 21 17:41:21.397927 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:21.397911 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" event={"ID":"680abfad-a760-4597-8554-0f027c4bf54a","Type":"ContainerDied","Data":"7d8ffc1a4717083d66634ce4c82d91345a2c1d0341efd922c33c8f1b19a4ff93"} Apr 21 17:41:22.408373 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.408333 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" event={"ID":"e8a8ce54-79d0-4e08-9b00-73835796a047","Type":"ContainerStarted","Data":"5e165b75db3fb9e2a4dca9881685c2b71b5b4c9e5d611c6df037c4f6fb1bff6a"} Apr 21 17:41:22.438939 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.438811 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" podStartSLOduration=0.849971914 
podStartE2EDuration="3.43879229s" podCreationTimestamp="2026-04-21 17:41:19 +0000 UTC" firstStartedPulling="2026-04-21 17:41:19.590861889 +0000 UTC m=+464.301555974" lastFinishedPulling="2026-04-21 17:41:22.179682247 +0000 UTC m=+466.890376350" observedRunningTime="2026-04-21 17:41:22.436382936 +0000 UTC m=+467.147077068" watchObservedRunningTime="2026-04-21 17:41:22.43879229 +0000 UTC m=+467.149486401" Apr 21 17:41:22.539251 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.539226 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:22.681514 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.681474 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzhxz\" (UniqueName: \"kubernetes.io/projected/680abfad-a760-4597-8554-0f027c4bf54a-kube-api-access-xzhxz\") pod \"680abfad-a760-4597-8554-0f027c4bf54a\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " Apr 21 17:41:22.681716 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.681546 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-util\") pod \"680abfad-a760-4597-8554-0f027c4bf54a\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " Apr 21 17:41:22.681716 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.681651 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-bundle\") pod \"680abfad-a760-4597-8554-0f027c4bf54a\" (UID: \"680abfad-a760-4597-8554-0f027c4bf54a\") " Apr 21 17:41:22.682569 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.682541 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-bundle" 
(OuterVolumeSpecName: "bundle") pod "680abfad-a760-4597-8554-0f027c4bf54a" (UID: "680abfad-a760-4597-8554-0f027c4bf54a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:41:22.683682 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.683655 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680abfad-a760-4597-8554-0f027c4bf54a-kube-api-access-xzhxz" (OuterVolumeSpecName: "kube-api-access-xzhxz") pod "680abfad-a760-4597-8554-0f027c4bf54a" (UID: "680abfad-a760-4597-8554-0f027c4bf54a"). InnerVolumeSpecName "kube-api-access-xzhxz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:41:22.686791 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.686762 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-util" (OuterVolumeSpecName: "util") pod "680abfad-a760-4597-8554-0f027c4bf54a" (UID: "680abfad-a760-4597-8554-0f027c4bf54a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:41:22.782668 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.782620 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:22.782668 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.782661 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xzhxz\" (UniqueName: \"kubernetes.io/projected/680abfad-a760-4597-8554-0f027c4bf54a-kube-api-access-xzhxz\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:22.782668 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:22.782674 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/680abfad-a760-4597-8554-0f027c4bf54a-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:23.413493 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:23.413457 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" Apr 21 17:41:23.413865 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:23.413456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9vzppz" event={"ID":"680abfad-a760-4597-8554-0f027c4bf54a","Type":"ContainerDied","Data":"4339a107da3afc611867bfb0dd9e4010dab4c527c7b136227db3afa3c6319000"} Apr 21 17:41:23.413865 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:23.413581 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4339a107da3afc611867bfb0dd9e4010dab4c527c7b136227db3afa3c6319000" Apr 21 17:41:23.413985 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:23.413884 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:34.419644 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:34.419610 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb" Apr 21 17:41:36.969740 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.969706 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt"] Apr 21 17:41:36.970153 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.970078 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="680abfad-a760-4597-8554-0f027c4bf54a" containerName="extract" Apr 21 17:41:36.970153 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.970090 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="680abfad-a760-4597-8554-0f027c4bf54a" containerName="extract" Apr 21 17:41:36.970153 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.970104 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="680abfad-a760-4597-8554-0f027c4bf54a" containerName="pull" Apr 21 17:41:36.970153 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.970111 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="680abfad-a760-4597-8554-0f027c4bf54a" containerName="pull" Apr 21 17:41:36.970153 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.970128 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="680abfad-a760-4597-8554-0f027c4bf54a" containerName="util" Apr 21 17:41:36.970153 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.970151 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="680abfad-a760-4597-8554-0f027c4bf54a" containerName="util" Apr 21 17:41:36.970346 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.970206 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="680abfad-a760-4597-8554-0f027c4bf54a" containerName="extract" Apr 21 17:41:36.974959 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.974937 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:36.978770 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.978736 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fdhhv\"" Apr 21 17:41:36.979892 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.979874 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 17:41:36.981577 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.981553 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 17:41:36.992965 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.992934 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:36.992965 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.992968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:36.993206 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:36.992991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbtb2\" (UniqueName: 
\"kubernetes.io/projected/7a214330-edae-48c4-870b-70714d553386-kube-api-access-vbtb2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:37.002826 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.002792 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt"] Apr 21 17:41:37.094347 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.094302 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:37.094347 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.094349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:37.094667 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.094380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbtb2\" (UniqueName: \"kubernetes.io/projected/7a214330-edae-48c4-870b-70714d553386-kube-api-access-vbtb2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:37.094767 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.094746 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:37.094820 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.094771 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:37.129312 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.129268 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbtb2\" (UniqueName: \"kubernetes.io/projected/7a214330-edae-48c4-870b-70714d553386-kube-api-access-vbtb2\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:37.284492 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.284453 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:37.441479 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.441373 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt"] Apr 21 17:41:37.444161 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:41:37.444100 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a214330_edae_48c4_870b_70714d553386.slice/crio-640c4083d45b3ef8d072b766b140324e59e143ce5937c5a56724399210264de7 WatchSource:0}: Error finding container 640c4083d45b3ef8d072b766b140324e59e143ce5937c5a56724399210264de7: Status 404 returned error can't find the container with id 640c4083d45b3ef8d072b766b140324e59e143ce5937c5a56724399210264de7 Apr 21 17:41:37.466862 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.466824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" event={"ID":"7a214330-edae-48c4-870b-70714d553386","Type":"ContainerStarted","Data":"640c4083d45b3ef8d072b766b140324e59e143ce5937c5a56724399210264de7"} Apr 21 17:41:37.577829 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.577792 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s"] Apr 21 17:41:37.581512 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.581488 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:37.587818 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.587792 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 17:41:37.587976 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.587792 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 17:41:37.588883 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.588865 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-nq62s\"" Apr 21 17:41:37.589009 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.588939 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 17:41:37.589009 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.588963 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 17:41:37.599011 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.598971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9cfc99ce-60a1-496f-91e7-3032cca09532-tmp\") pod \"kube-auth-proxy-548b8d8fcb-zgz5s\" (UID: \"9cfc99ce-60a1-496f-91e7-3032cca09532\") " pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:37.599175 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.599024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfc99ce-60a1-496f-91e7-3032cca09532-tls-certs\") pod \"kube-auth-proxy-548b8d8fcb-zgz5s\" (UID: \"9cfc99ce-60a1-496f-91e7-3032cca09532\") " pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 
17:41:37.599246 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.599161 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg24r\" (UniqueName: \"kubernetes.io/projected/9cfc99ce-60a1-496f-91e7-3032cca09532-kube-api-access-pg24r\") pod \"kube-auth-proxy-548b8d8fcb-zgz5s\" (UID: \"9cfc99ce-60a1-496f-91e7-3032cca09532\") " pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:37.602660 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.602632 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s"] Apr 21 17:41:37.700692 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.700646 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9cfc99ce-60a1-496f-91e7-3032cca09532-tmp\") pod \"kube-auth-proxy-548b8d8fcb-zgz5s\" (UID: \"9cfc99ce-60a1-496f-91e7-3032cca09532\") " pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:37.700692 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.700697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfc99ce-60a1-496f-91e7-3032cca09532-tls-certs\") pod \"kube-auth-proxy-548b8d8fcb-zgz5s\" (UID: \"9cfc99ce-60a1-496f-91e7-3032cca09532\") " pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:37.700954 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.700735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg24r\" (UniqueName: \"kubernetes.io/projected/9cfc99ce-60a1-496f-91e7-3032cca09532-kube-api-access-pg24r\") pod \"kube-auth-proxy-548b8d8fcb-zgz5s\" (UID: \"9cfc99ce-60a1-496f-91e7-3032cca09532\") " pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:37.703112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.703080 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9cfc99ce-60a1-496f-91e7-3032cca09532-tmp\") pod \"kube-auth-proxy-548b8d8fcb-zgz5s\" (UID: \"9cfc99ce-60a1-496f-91e7-3032cca09532\") " pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:37.703391 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.703370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9cfc99ce-60a1-496f-91e7-3032cca09532-tls-certs\") pod \"kube-auth-proxy-548b8d8fcb-zgz5s\" (UID: \"9cfc99ce-60a1-496f-91e7-3032cca09532\") " pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:37.715214 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.715183 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg24r\" (UniqueName: \"kubernetes.io/projected/9cfc99ce-60a1-496f-91e7-3032cca09532-kube-api-access-pg24r\") pod \"kube-auth-proxy-548b8d8fcb-zgz5s\" (UID: \"9cfc99ce-60a1-496f-91e7-3032cca09532\") " pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:37.927850 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:37.927759 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" Apr 21 17:41:38.105902 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.105875 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s"] Apr 21 17:41:38.107969 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:41:38.107940 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cfc99ce_60a1_496f_91e7_3032cca09532.slice/crio-7b09460af347d390337fc2d006cd5329811c24f550a33234a54106fe18eabc44 WatchSource:0}: Error finding container 7b09460af347d390337fc2d006cd5329811c24f550a33234a54106fe18eabc44: Status 404 returned error can't find the container with id 7b09460af347d390337fc2d006cd5329811c24f550a33234a54106fe18eabc44 Apr 21 17:41:38.471570 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.471524 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" event={"ID":"9cfc99ce-60a1-496f-91e7-3032cca09532","Type":"ContainerStarted","Data":"7b09460af347d390337fc2d006cd5329811c24f550a33234a54106fe18eabc44"} Apr 21 17:41:38.472821 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.472792 2573 generic.go:358] "Generic (PLEG): container finished" podID="7a214330-edae-48c4-870b-70714d553386" containerID="987cdf7443dd01e88e974f9a901cf0b9f055d2625bb3a470e8d4167e6a113e33" exitCode=0 Apr 21 17:41:38.472936 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.472874 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" event={"ID":"7a214330-edae-48c4-870b-70714d553386","Type":"ContainerDied","Data":"987cdf7443dd01e88e974f9a901cf0b9f055d2625bb3a470e8d4167e6a113e33"} Apr 21 17:41:38.925765 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.925703 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp"] Apr 21 17:41:38.929656 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.929610 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:38.934327 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.934277 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-x6qtw\"" Apr 21 17:41:38.935290 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.935267 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 17:41:38.935519 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.935495 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 17:41:38.935585 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.935497 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 17:41:38.935585 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.935572 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 17:41:38.935680 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.935597 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 17:41:38.945324 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:38.944540 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp"] Apr 21 17:41:39.011483 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.011403 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" 
(UniqueName: \"kubernetes.io/configmap/dc724ecf-4612-46dc-8525-fb6475593865-manager-config\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.011694 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.011533 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc724ecf-4612-46dc-8525-fb6475593865-cert\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.011694 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.011584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dc724ecf-4612-46dc-8525-fb6475593865-metrics-cert\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.011694 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.011621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkxff\" (UniqueName: \"kubernetes.io/projected/dc724ecf-4612-46dc-8525-fb6475593865-kube-api-access-pkxff\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.112914 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.112868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dc724ecf-4612-46dc-8525-fb6475593865-metrics-cert\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " 
pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.113427 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.112928 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkxff\" (UniqueName: \"kubernetes.io/projected/dc724ecf-4612-46dc-8525-fb6475593865-kube-api-access-pkxff\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.113427 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.113050 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dc724ecf-4612-46dc-8525-fb6475593865-manager-config\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.113427 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.113096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/dc724ecf-4612-46dc-8525-fb6475593865-cert\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.113785 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.113747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/dc724ecf-4612-46dc-8525-fb6475593865-manager-config\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.115787 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.115757 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/dc724ecf-4612-46dc-8525-fb6475593865-cert\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.115963 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.115939 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/dc724ecf-4612-46dc-8525-fb6475593865-metrics-cert\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.170618 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.170584 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkxff\" (UniqueName: \"kubernetes.io/projected/dc724ecf-4612-46dc-8525-fb6475593865-kube-api-access-pkxff\") pod \"lws-controller-manager-bdd4f6877-g28kp\" (UID: \"dc724ecf-4612-46dc-8525-fb6475593865\") " pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.253734 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.253697 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:39.427516 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:39.427481 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp"] Apr 21 17:41:40.039173 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:41:40.039126 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc724ecf_4612_46dc_8525_fb6475593865.slice/crio-61b8317b808fe5149e6c4085b334869350da1f395646573d17074cd2dc96a5d7 WatchSource:0}: Error finding container 61b8317b808fe5149e6c4085b334869350da1f395646573d17074cd2dc96a5d7: Status 404 returned error can't find the container with id 61b8317b808fe5149e6c4085b334869350da1f395646573d17074cd2dc96a5d7 Apr 21 17:41:40.484394 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:40.484354 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" event={"ID":"dc724ecf-4612-46dc-8525-fb6475593865","Type":"ContainerStarted","Data":"61b8317b808fe5149e6c4085b334869350da1f395646573d17074cd2dc96a5d7"} Apr 21 17:41:40.486272 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:40.486242 2573 generic.go:358] "Generic (PLEG): container finished" podID="7a214330-edae-48c4-870b-70714d553386" containerID="84f60016df5900210c955f1e39433852635ca922af2948588cd9ea757e3a4f7b" exitCode=0 Apr 21 17:41:40.486417 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:40.486282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" event={"ID":"7a214330-edae-48c4-870b-70714d553386","Type":"ContainerDied","Data":"84f60016df5900210c955f1e39433852635ca922af2948588cd9ea757e3a4f7b"} Apr 21 17:41:41.492792 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:41.492754 2573 generic.go:358] "Generic (PLEG): container 
finished" podID="7a214330-edae-48c4-870b-70714d553386" containerID="a5e31c91bad708ea2c7e9579c4cd3b848594030a2b53584386388ab4dacb2417" exitCode=0 Apr 21 17:41:41.493259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:41.492847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" event={"ID":"7a214330-edae-48c4-870b-70714d553386","Type":"ContainerDied","Data":"a5e31c91bad708ea2c7e9579c4cd3b848594030a2b53584386388ab4dacb2417"} Apr 21 17:41:42.633956 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.633930 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:42.747005 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.746910 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-util\") pod \"7a214330-edae-48c4-870b-70714d553386\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " Apr 21 17:41:42.747005 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.746990 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbtb2\" (UniqueName: \"kubernetes.io/projected/7a214330-edae-48c4-870b-70714d553386-kube-api-access-vbtb2\") pod \"7a214330-edae-48c4-870b-70714d553386\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " Apr 21 17:41:42.747250 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.747024 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-bundle\") pod \"7a214330-edae-48c4-870b-70714d553386\" (UID: \"7a214330-edae-48c4-870b-70714d553386\") " Apr 21 17:41:42.747892 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.747866 2573 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-bundle" (OuterVolumeSpecName: "bundle") pod "7a214330-edae-48c4-870b-70714d553386" (UID: "7a214330-edae-48c4-870b-70714d553386"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:41:42.749420 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.749392 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a214330-edae-48c4-870b-70714d553386-kube-api-access-vbtb2" (OuterVolumeSpecName: "kube-api-access-vbtb2") pod "7a214330-edae-48c4-870b-70714d553386" (UID: "7a214330-edae-48c4-870b-70714d553386"). InnerVolumeSpecName "kube-api-access-vbtb2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:41:42.751924 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.751902 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-util" (OuterVolumeSpecName: "util") pod "7a214330-edae-48c4-870b-70714d553386" (UID: "7a214330-edae-48c4-870b-70714d553386"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:41:42.847963 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.847922 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:42.847963 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.847957 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vbtb2\" (UniqueName: \"kubernetes.io/projected/7a214330-edae-48c4-870b-70714d553386-kube-api-access-vbtb2\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:42.847963 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:42.847968 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7a214330-edae-48c4-870b-70714d553386-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:43.504033 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:43.503991 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" event={"ID":"9cfc99ce-60a1-496f-91e7-3032cca09532","Type":"ContainerStarted","Data":"e13c9aaf58ca49bf27ea87c4c6d9c690455e7ad83551ea8072278f16ef05b823"} Apr 21 17:41:43.505486 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:43.505454 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" event={"ID":"dc724ecf-4612-46dc-8525-fb6475593865","Type":"ContainerStarted","Data":"967884a275943c5b9497feb0e675df268905abe1c94c5ab25490ca810adcb73c"} Apr 21 17:41:43.505628 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:43.505549 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:43.507103 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:43.507083 2573 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" Apr 21 17:41:43.507103 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:43.507093 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c483579ngt" event={"ID":"7a214330-edae-48c4-870b-70714d553386","Type":"ContainerDied","Data":"640c4083d45b3ef8d072b766b140324e59e143ce5937c5a56724399210264de7"} Apr 21 17:41:43.507277 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:43.507117 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="640c4083d45b3ef8d072b766b140324e59e143ce5937c5a56724399210264de7" Apr 21 17:41:43.536835 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:43.531784 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-548b8d8fcb-zgz5s" podStartSLOduration=2.142012682 podStartE2EDuration="6.531725077s" podCreationTimestamp="2026-04-21 17:41:37 +0000 UTC" firstStartedPulling="2026-04-21 17:41:38.110005883 +0000 UTC m=+482.820699970" lastFinishedPulling="2026-04-21 17:41:42.499718273 +0000 UTC m=+487.210412365" observedRunningTime="2026-04-21 17:41:43.529754923 +0000 UTC m=+488.240449054" watchObservedRunningTime="2026-04-21 17:41:43.531725077 +0000 UTC m=+488.242419185" Apr 21 17:41:43.566663 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:43.566586 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" podStartSLOduration=3.119150711 podStartE2EDuration="5.566567285s" podCreationTimestamp="2026-04-21 17:41:38 +0000 UTC" firstStartedPulling="2026-04-21 17:41:40.040954989 +0000 UTC m=+484.751649075" lastFinishedPulling="2026-04-21 17:41:42.488371559 +0000 UTC m=+487.199065649" observedRunningTime="2026-04-21 17:41:43.565318861 +0000 UTC m=+488.276012968" 
watchObservedRunningTime="2026-04-21 17:41:43.566567285 +0000 UTC m=+488.277261390" Apr 21 17:41:51.584446 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.584406 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk"] Apr 21 17:41:51.584839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.584750 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a214330-edae-48c4-870b-70714d553386" containerName="util" Apr 21 17:41:51.584839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.584762 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a214330-edae-48c4-870b-70714d553386" containerName="util" Apr 21 17:41:51.584839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.584770 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a214330-edae-48c4-870b-70714d553386" containerName="extract" Apr 21 17:41:51.584839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.584775 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a214330-edae-48c4-870b-70714d553386" containerName="extract" Apr 21 17:41:51.584839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.584783 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a214330-edae-48c4-870b-70714d553386" containerName="pull" Apr 21 17:41:51.584839 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.584789 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a214330-edae-48c4-870b-70714d553386" containerName="pull" Apr 21 17:41:51.585025 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.584861 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a214330-edae-48c4-870b-70714d553386" containerName="extract" Apr 21 17:41:51.589543 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.589517 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:51.594167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.594117 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-fdhhv\"" Apr 21 17:41:51.594522 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.594503 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 17:41:51.595385 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.595366 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 17:41:51.611149 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.611104 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk"] Apr 21 17:41:51.731309 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.731274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:51.731500 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.731331 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbkbk\" (UniqueName: \"kubernetes.io/projected/c49ee73c-c95f-45ad-9576-aeca448fc758-kube-api-access-fbkbk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" 
Apr 21 17:41:51.731500 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.731397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:51.832112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.832068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:51.832342 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.832157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbkbk\" (UniqueName: \"kubernetes.io/projected/c49ee73c-c95f-45ad-9576-aeca448fc758-kube-api-access-fbkbk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:51.832342 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.832188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:51.832493 ip-10-0-134-77 kubenswrapper[2573]: 
I0421 17:41:51.832469 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:51.832583 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.832564 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:51.861353 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.861277 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbkbk\" (UniqueName: \"kubernetes.io/projected/c49ee73c-c95f-45ad-9576-aeca448fc758-kube-api-access-fbkbk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:51.899210 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:51.899168 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:52.082590 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:52.082552 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk"] Apr 21 17:41:52.086258 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:41:52.086221 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc49ee73c_c95f_45ad_9576_aeca448fc758.slice/crio-a5c2b5714ba848b1405ebe2a2841730cd39ae324088d0fcec845f1a1c6c5855e WatchSource:0}: Error finding container a5c2b5714ba848b1405ebe2a2841730cd39ae324088d0fcec845f1a1c6c5855e: Status 404 returned error can't find the container with id a5c2b5714ba848b1405ebe2a2841730cd39ae324088d0fcec845f1a1c6c5855e Apr 21 17:41:52.542714 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:52.542677 2573 generic.go:358] "Generic (PLEG): container finished" podID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerID="70209c5dfaba3b83d7a4b486ba0d0b5d0d46e08f3ec251f30bd518948e69bece" exitCode=0 Apr 21 17:41:52.542869 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:52.542740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" event={"ID":"c49ee73c-c95f-45ad-9576-aeca448fc758","Type":"ContainerDied","Data":"70209c5dfaba3b83d7a4b486ba0d0b5d0d46e08f3ec251f30bd518948e69bece"} Apr 21 17:41:52.542869 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:52.542761 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" event={"ID":"c49ee73c-c95f-45ad-9576-aeca448fc758","Type":"ContainerStarted","Data":"a5c2b5714ba848b1405ebe2a2841730cd39ae324088d0fcec845f1a1c6c5855e"} Apr 21 17:41:54.513059 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:41:54.513017 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-bdd4f6877-g28kp" Apr 21 17:41:54.551776 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:54.551733 2573 generic.go:358] "Generic (PLEG): container finished" podID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerID="fdf7af20a4dcf5a355e6dffa4e0a152cb9726126dd0e7c1176f5644c0f2ce24a" exitCode=0 Apr 21 17:41:54.551947 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:54.551817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" event={"ID":"c49ee73c-c95f-45ad-9576-aeca448fc758","Type":"ContainerDied","Data":"fdf7af20a4dcf5a355e6dffa4e0a152cb9726126dd0e7c1176f5644c0f2ce24a"} Apr 21 17:41:55.559547 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:55.559117 2573 generic.go:358] "Generic (PLEG): container finished" podID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerID="f3cb7680b33fdba719cbaadf79cb4ada3164d5acea87411b2bd3e53c7be964e4" exitCode=0 Apr 21 17:41:55.559547 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:55.559308 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" event={"ID":"c49ee73c-c95f-45ad-9576-aeca448fc758","Type":"ContainerDied","Data":"f3cb7680b33fdba719cbaadf79cb4ada3164d5acea87411b2bd3e53c7be964e4"} Apr 21 17:41:56.692010 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.691981 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:56.780943 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.780907 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-util\") pod \"c49ee73c-c95f-45ad-9576-aeca448fc758\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " Apr 21 17:41:56.781173 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.780975 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbkbk\" (UniqueName: \"kubernetes.io/projected/c49ee73c-c95f-45ad-9576-aeca448fc758-kube-api-access-fbkbk\") pod \"c49ee73c-c95f-45ad-9576-aeca448fc758\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " Apr 21 17:41:56.781173 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.781038 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-bundle\") pod \"c49ee73c-c95f-45ad-9576-aeca448fc758\" (UID: \"c49ee73c-c95f-45ad-9576-aeca448fc758\") " Apr 21 17:41:56.781899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.781871 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-bundle" (OuterVolumeSpecName: "bundle") pod "c49ee73c-c95f-45ad-9576-aeca448fc758" (UID: "c49ee73c-c95f-45ad-9576-aeca448fc758"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:41:56.783080 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.783059 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49ee73c-c95f-45ad-9576-aeca448fc758-kube-api-access-fbkbk" (OuterVolumeSpecName: "kube-api-access-fbkbk") pod "c49ee73c-c95f-45ad-9576-aeca448fc758" (UID: "c49ee73c-c95f-45ad-9576-aeca448fc758"). InnerVolumeSpecName "kube-api-access-fbkbk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:41:56.786490 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.786460 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-util" (OuterVolumeSpecName: "util") pod "c49ee73c-c95f-45ad-9576-aeca448fc758" (UID: "c49ee73c-c95f-45ad-9576-aeca448fc758"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:41:56.881944 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.881859 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:56.881944 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.881889 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c49ee73c-c95f-45ad-9576-aeca448fc758-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:56.881944 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:56.881898 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbkbk\" (UniqueName: \"kubernetes.io/projected/c49ee73c-c95f-45ad-9576-aeca448fc758-kube-api-access-fbkbk\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:41:57.567859 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:57.567828 2573 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" Apr 21 17:41:57.568043 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:57.567826 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c242ptk" event={"ID":"c49ee73c-c95f-45ad-9576-aeca448fc758","Type":"ContainerDied","Data":"a5c2b5714ba848b1405ebe2a2841730cd39ae324088d0fcec845f1a1c6c5855e"} Apr 21 17:41:57.568043 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:41:57.567937 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5c2b5714ba848b1405ebe2a2841730cd39ae324088d0fcec845f1a1c6c5855e" Apr 21 17:42:53.279694 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.279650 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"] Apr 21 17:42:53.280209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.279998 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerName="extract" Apr 21 17:42:53.280209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.280010 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerName="extract" Apr 21 17:42:53.280209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.280027 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerName="util" Apr 21 17:42:53.280209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.280033 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerName="util" Apr 21 17:42:53.280209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.280041 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerName="pull"
Apr 21 17:42:53.280209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.280047 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerName="pull"
Apr 21 17:42:53.280209 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.280108 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c49ee73c-c95f-45ad-9576-aeca448fc758" containerName="extract"
Apr 21 17:42:53.283161 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.283119 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.286022 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.285999 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 17:42:53.287167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.287148 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-hdtws\""
Apr 21 17:42:53.287276 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.287154 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 17:42:53.291197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.290881 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"]
Apr 21 17:42:53.380011 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.379967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vll5r\" (UniqueName: \"kubernetes.io/projected/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-kube-api-access-vll5r\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.380227 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.380102 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.380227 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.380166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.481151 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.481105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.481398 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.481184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.481398 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.481231 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vll5r\" (UniqueName: \"kubernetes.io/projected/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-kube-api-access-vll5r\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.481524 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.481505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.481561 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.481538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.490143 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.490107 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vll5r\" (UniqueName: \"kubernetes.io/projected/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-kube-api-access-vll5r\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.593443 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.593348 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:53.725941 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.725909 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"]
Apr 21 17:42:53.727770 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:42:53.727735 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8942b5ef_5b9d_4ba2_8582_523f88c3feb6.slice/crio-b2e87560d2caf091d2c9e86b47571f035f5551e38426b961a2651b956e00d3af WatchSource:0}: Error finding container b2e87560d2caf091d2c9e86b47571f035f5551e38426b961a2651b956e00d3af: Status 404 returned error can't find the container with id b2e87560d2caf091d2c9e86b47571f035f5551e38426b961a2651b956e00d3af
Apr 21 17:42:53.772693 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.772652 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv" event={"ID":"8942b5ef-5b9d-4ba2-8582-523f88c3feb6","Type":"ContainerStarted","Data":"b2e87560d2caf091d2c9e86b47571f035f5551e38426b961a2651b956e00d3af"}
Apr 21 17:42:53.879581 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.879500 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"]
Apr 21 17:42:53.882862 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.882844 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:53.889565 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.889535 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"]
Apr 21 17:42:53.986012 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.985946 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:53.986258 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.986042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz77q\" (UniqueName: \"kubernetes.io/projected/c1cd980a-2ba1-4c37-9d39-0bdb967db655-kube-api-access-gz77q\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:53.986258 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:53.986223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:54.087294 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.087247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gz77q\" (UniqueName: \"kubernetes.io/projected/c1cd980a-2ba1-4c37-9d39-0bdb967db655-kube-api-access-gz77q\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:54.087499 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.087339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:54.087499 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.087393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:54.087736 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.087710 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:54.087800 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.087744 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:54.096085 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.096052 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz77q\" (UniqueName: \"kubernetes.io/projected/c1cd980a-2ba1-4c37-9d39-0bdb967db655-kube-api-access-gz77q\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:54.200629 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.200530 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:54.329108 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.329080 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"]
Apr 21 17:42:54.331914 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:42:54.331883 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1cd980a_2ba1_4c37_9d39_0bdb967db655.slice/crio-7f799aefd854d9c3e5937e3b593d10598ea0d32b4e250e535155187c0370551a WatchSource:0}: Error finding container 7f799aefd854d9c3e5937e3b593d10598ea0d32b4e250e535155187c0370551a: Status 404 returned error can't find the container with id 7f799aefd854d9c3e5937e3b593d10598ea0d32b4e250e535155187c0370551a
Apr 21 17:42:54.479501 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.479414 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"]
Apr 21 17:42:54.483142 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.483113 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.490508 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.490473 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"]
Apr 21 17:42:54.591471 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.591436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.591656 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.591494 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.591656 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.591613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lzh\" (UniqueName: \"kubernetes.io/projected/8af1143b-af39-4628-bce0-d70b72ae6de4-kube-api-access-66lzh\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.692696 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.692651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66lzh\" (UniqueName: \"kubernetes.io/projected/8af1143b-af39-4628-bce0-d70b72ae6de4-kube-api-access-66lzh\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.692881 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.692724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.692881 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.692781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.693171 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.693121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.693207 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.693187 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.702925 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.702888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66lzh\" (UniqueName: \"kubernetes.io/projected/8af1143b-af39-4628-bce0-d70b72ae6de4-kube-api-access-66lzh\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.777744 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.777707 2573 generic.go:358] "Generic (PLEG): container finished" podID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerID="7a7c6acb2ece53f5bb1d64589b394f45e1673f962ff2448ff820059fa8dc34c3" exitCode=0
Apr 21 17:42:54.777944 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.777792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv" event={"ID":"8942b5ef-5b9d-4ba2-8582-523f88c3feb6","Type":"ContainerDied","Data":"7a7c6acb2ece53f5bb1d64589b394f45e1673f962ff2448ff820059fa8dc34c3"}
Apr 21 17:42:54.779376 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.779351 2573 generic.go:358] "Generic (PLEG): container finished" podID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerID="f9406b450c1d5539fc52d0fd082ee69210e97c5dd332d4895ce2977cc7ec0b9b" exitCode=0
Apr 21 17:42:54.779479 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.779396 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm" event={"ID":"c1cd980a-2ba1-4c37-9d39-0bdb967db655","Type":"ContainerDied","Data":"f9406b450c1d5539fc52d0fd082ee69210e97c5dd332d4895ce2977cc7ec0b9b"}
Apr 21 17:42:54.779479 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.779419 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm" event={"ID":"c1cd980a-2ba1-4c37-9d39-0bdb967db655","Type":"ContainerStarted","Data":"7f799aefd854d9c3e5937e3b593d10598ea0d32b4e250e535155187c0370551a"}
Apr 21 17:42:54.809520 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.809427 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"
Apr 21 17:42:54.880055 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.880023 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"]
Apr 21 17:42:54.885165 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.885110 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:54.891424 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.891391 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"]
Apr 21 17:42:54.949065 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.949010 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh"]
Apr 21 17:42:54.951236 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:42:54.951179 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8af1143b_af39_4628_bce0_d70b72ae6de4.slice/crio-393170c49741fa085cba719ba9b88b6842669d0aef23a7631a8a98b8c9a7c355 WatchSource:0}: Error finding container 393170c49741fa085cba719ba9b88b6842669d0aef23a7631a8a98b8c9a7c355: Status 404 returned error can't find the container with id 393170c49741fa085cba719ba9b88b6842669d0aef23a7631a8a98b8c9a7c355
Apr 21 17:42:54.995707 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.995674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcdjg\" (UniqueName: \"kubernetes.io/projected/35bda4df-27c7-400d-bca3-30bd37f0fe76-kube-api-access-wcdjg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:54.995869 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.995723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:54.995869 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:54.995814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:55.097395 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.097354 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:55.097589 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.097428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcdjg\" (UniqueName: \"kubernetes.io/projected/35bda4df-27c7-400d-bca3-30bd37f0fe76-kube-api-access-wcdjg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:55.097589 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.097454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:55.097773 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.097751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:55.097832 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.097781 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:55.108053 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.108023 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcdjg\" (UniqueName: \"kubernetes.io/projected/35bda4df-27c7-400d-bca3-30bd37f0fe76-kube-api-access-wcdjg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:55.197103 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.197046 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"
Apr 21 17:42:55.329057 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.329021 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb"]
Apr 21 17:42:55.330774 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:42:55.330747 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35bda4df_27c7_400d_bca3_30bd37f0fe76.slice/crio-50c029a07bcad12715e7ef53b221bdda39eff3460f079592218b5385833bbb86 WatchSource:0}: Error finding container 50c029a07bcad12715e7ef53b221bdda39eff3460f079592218b5385833bbb86: Status 404 returned error can't find the container with id 50c029a07bcad12715e7ef53b221bdda39eff3460f079592218b5385833bbb86
Apr 21 17:42:55.785284 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.785248 2573 generic.go:358] "Generic (PLEG): container finished" podID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerID="2c8a504616d86f58c2a4ef3090943638ae79bfae5bb70157344e75706eba6bec" exitCode=0
Apr 21 17:42:55.785528 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.785331 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv" event={"ID":"8942b5ef-5b9d-4ba2-8582-523f88c3feb6","Type":"ContainerDied","Data":"2c8a504616d86f58c2a4ef3090943638ae79bfae5bb70157344e75706eba6bec"}
Apr 21 17:42:55.786947 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.786919 2573 generic.go:358] "Generic (PLEG): container finished" podID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerID="826de90c93f28f6b544d66d0aa19f47f4e02a17be93546bb7a7ea73e96441ffb" exitCode=0
Apr 21 17:42:55.787054 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.787001 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb" event={"ID":"35bda4df-27c7-400d-bca3-30bd37f0fe76","Type":"ContainerDied","Data":"826de90c93f28f6b544d66d0aa19f47f4e02a17be93546bb7a7ea73e96441ffb"}
Apr 21 17:42:55.787054 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.787023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb" event={"ID":"35bda4df-27c7-400d-bca3-30bd37f0fe76","Type":"ContainerStarted","Data":"50c029a07bcad12715e7ef53b221bdda39eff3460f079592218b5385833bbb86"}
Apr 21 17:42:55.788737 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.788714 2573 generic.go:358] "Generic (PLEG): container finished" podID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerID="bd17b2241d2e0f0182cb8bca30d651910fa1df6486bb5c9f2d8fcb1e8f59d5ce" exitCode=0
Apr 21 17:42:55.788830 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.788792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm" event={"ID":"c1cd980a-2ba1-4c37-9d39-0bdb967db655","Type":"ContainerDied","Data":"bd17b2241d2e0f0182cb8bca30d651910fa1df6486bb5c9f2d8fcb1e8f59d5ce"}
Apr 21 17:42:55.790432 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.790418 2573 generic.go:358] "Generic (PLEG): container finished" podID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerID="5adf621d9506000def2924c0a2463e9af3da19a68237941c4309d500e8be8c3f" exitCode=0
Apr 21 17:42:55.790482 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.790448 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh" event={"ID":"8af1143b-af39-4628-bce0-d70b72ae6de4","Type":"ContainerDied","Data":"5adf621d9506000def2924c0a2463e9af3da19a68237941c4309d500e8be8c3f"}
Apr 21 17:42:55.790482 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:55.790470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh" event={"ID":"8af1143b-af39-4628-bce0-d70b72ae6de4","Type":"ContainerStarted","Data":"393170c49741fa085cba719ba9b88b6842669d0aef23a7631a8a98b8c9a7c355"}
Apr 21 17:42:56.798226 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:56.798169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb" event={"ID":"35bda4df-27c7-400d-bca3-30bd37f0fe76","Type":"ContainerStarted","Data":"dfd8f2a60999cd0f9694dbb465ff7095b4742048d30e979e824887ab952286f7"}
Apr 21 17:42:56.800380 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:56.800349 2573 generic.go:358] "Generic (PLEG): container finished" podID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerID="c07ec54e68677cf4c37742deaeaafefdd427b18dfec5771ee0ea08c8e20bb0a2" exitCode=0
Apr 21 17:42:56.800525 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:56.800440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm" event={"ID":"c1cd980a-2ba1-4c37-9d39-0bdb967db655","Type":"ContainerDied","Data":"c07ec54e68677cf4c37742deaeaafefdd427b18dfec5771ee0ea08c8e20bb0a2"}
Apr 21 17:42:56.802299 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:56.802271 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh" event={"ID":"8af1143b-af39-4628-bce0-d70b72ae6de4","Type":"ContainerStarted","Data":"0cd5d7d9001822c68538eac3bb042016e25a7e58a5c780b4eacdbb70c66795d0"}
Apr 21 17:42:56.804532 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:56.804432 2573 generic.go:358] "Generic (PLEG): container finished" podID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerID="f5f2f7f8dbb16810f9fe5eef9bff0e2858a47985e2900e205a6317695d172dfa" exitCode=0
Apr 21 17:42:56.804532 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:56.804464 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv" event={"ID":"8942b5ef-5b9d-4ba2-8582-523f88c3feb6","Type":"ContainerDied","Data":"f5f2f7f8dbb16810f9fe5eef9bff0e2858a47985e2900e205a6317695d172dfa"}
Apr 21 17:42:57.813789 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:57.813756 2573 generic.go:358] "Generic (PLEG): container finished" podID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerID="0cd5d7d9001822c68538eac3bb042016e25a7e58a5c780b4eacdbb70c66795d0" exitCode=0
Apr 21 17:42:57.814229 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:57.813842 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh" event={"ID":"8af1143b-af39-4628-bce0-d70b72ae6de4","Type":"ContainerDied","Data":"0cd5d7d9001822c68538eac3bb042016e25a7e58a5c780b4eacdbb70c66795d0"}
Apr 21 17:42:57.815819 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:57.815795 2573 generic.go:358] "Generic (PLEG): container finished" podID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerID="dfd8f2a60999cd0f9694dbb465ff7095b4742048d30e979e824887ab952286f7" exitCode=0
Apr 21 17:42:57.815900 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:57.815873 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb" event={"ID":"35bda4df-27c7-400d-bca3-30bd37f0fe76","Type":"ContainerDied","Data":"dfd8f2a60999cd0f9694dbb465ff7095b4742048d30e979e824887ab952286f7"}
Apr 21 17:42:57.963453 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:57.963171 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv"
Apr 21 17:42:58.080271 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.080193 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm"
Apr 21 17:42:58.123991 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.123957 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-util\") pod \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") "
Apr 21 17:42:58.124223 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.124030 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vll5r\" (UniqueName: \"kubernetes.io/projected/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-kube-api-access-vll5r\") pod \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") "
Apr 21 17:42:58.124223 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.124068 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-bundle\") pod \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\" (UID: \"8942b5ef-5b9d-4ba2-8582-523f88c3feb6\") "
Apr 21 17:42:58.124616 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.124578 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-bundle" (OuterVolumeSpecName: "bundle") pod "8942b5ef-5b9d-4ba2-8582-523f88c3feb6" (UID: "8942b5ef-5b9d-4ba2-8582-523f88c3feb6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 17:42:58.126441 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.126408 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-kube-api-access-vll5r" (OuterVolumeSpecName: "kube-api-access-vll5r") pod "8942b5ef-5b9d-4ba2-8582-523f88c3feb6" (UID: "8942b5ef-5b9d-4ba2-8582-523f88c3feb6"). InnerVolumeSpecName "kube-api-access-vll5r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 17:42:58.129988 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.129940 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-util" (OuterVolumeSpecName: "util") pod "8942b5ef-5b9d-4ba2-8582-523f88c3feb6" (UID: "8942b5ef-5b9d-4ba2-8582-523f88c3feb6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 17:42:58.225435 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.225401 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz77q\" (UniqueName: \"kubernetes.io/projected/c1cd980a-2ba1-4c37-9d39-0bdb967db655-kube-api-access-gz77q\") pod \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") "
Apr 21 17:42:58.225586 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.225478 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-bundle\") pod \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") "
Apr 21 17:42:58.225673 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.225645 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-util\") pod \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\" (UID: \"c1cd980a-2ba1-4c37-9d39-0bdb967db655\") "
Apr 21 17:42:58.225983 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.225963 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:42:58.226072 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.225984 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vll5r\" (UniqueName: \"kubernetes.io/projected/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-kube-api-access-vll5r\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:42:58.226072 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.225998 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8942b5ef-5b9d-4ba2-8582-523f88c3feb6-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:42:58.226072 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.226023 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-bundle" (OuterVolumeSpecName: "bundle") pod "c1cd980a-2ba1-4c37-9d39-0bdb967db655" (UID: "c1cd980a-2ba1-4c37-9d39-0bdb967db655"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 17:42:58.227674 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.227653 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1cd980a-2ba1-4c37-9d39-0bdb967db655-kube-api-access-gz77q" (OuterVolumeSpecName: "kube-api-access-gz77q") pod "c1cd980a-2ba1-4c37-9d39-0bdb967db655" (UID: "c1cd980a-2ba1-4c37-9d39-0bdb967db655"). InnerVolumeSpecName "kube-api-access-gz77q".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:42:58.231236 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.231207 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-util" (OuterVolumeSpecName: "util") pod "c1cd980a-2ba1-4c37-9d39-0bdb967db655" (UID: "c1cd980a-2ba1-4c37-9d39-0bdb967db655"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:42:58.327486 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.327428 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gz77q\" (UniqueName: \"kubernetes.io/projected/c1cd980a-2ba1-4c37-9d39-0bdb967db655-kube-api-access-gz77q\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:42:58.327486 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.327478 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:42:58.327486 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.327497 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c1cd980a-2ba1-4c37-9d39-0bdb967db655-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:42:58.821473 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.821434 2573 generic.go:358] "Generic (PLEG): container finished" podID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerID="5c27140a4489044adf98f7ff56d3d646f76a6389b4e011e5568e3993f7ddb12c" exitCode=0 Apr 21 17:42:58.821951 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.821522 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb" 
event={"ID":"35bda4df-27c7-400d-bca3-30bd37f0fe76","Type":"ContainerDied","Data":"5c27140a4489044adf98f7ff56d3d646f76a6389b4e011e5568e3993f7ddb12c"} Apr 21 17:42:58.823313 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.823286 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm" event={"ID":"c1cd980a-2ba1-4c37-9d39-0bdb967db655","Type":"ContainerDied","Data":"7f799aefd854d9c3e5937e3b593d10598ea0d32b4e250e535155187c0370551a"} Apr 21 17:42:58.823313 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.823304 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm" Apr 21 17:42:58.823492 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.823318 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f799aefd854d9c3e5937e3b593d10598ea0d32b4e250e535155187c0370551a" Apr 21 17:42:58.825302 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.825276 2573 generic.go:358] "Generic (PLEG): container finished" podID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerID="b612f9c8ac58b4b6ce2828ec093eda4e5e511d75d4440e65bfa513bed04b674b" exitCode=0 Apr 21 17:42:58.825428 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.825356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh" event={"ID":"8af1143b-af39-4628-bce0-d70b72ae6de4","Type":"ContainerDied","Data":"b612f9c8ac58b4b6ce2828ec093eda4e5e511d75d4440e65bfa513bed04b674b"} Apr 21 17:42:58.827103 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.827071 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv" 
event={"ID":"8942b5ef-5b9d-4ba2-8582-523f88c3feb6","Type":"ContainerDied","Data":"b2e87560d2caf091d2c9e86b47571f035f5551e38426b961a2651b956e00d3af"} Apr 21 17:42:58.827103 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.827088 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv" Apr 21 17:42:58.827103 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:58.827095 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2e87560d2caf091d2c9e86b47571f035f5551e38426b961a2651b956e00d3af" Apr 21 17:42:59.962558 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:59.962531 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb" Apr 21 17:42:59.996431 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:42:59.996401 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh" Apr 21 17:43:00.142565 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.142453 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-bundle\") pod \"35bda4df-27c7-400d-bca3-30bd37f0fe76\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " Apr 21 17:43:00.142565 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.142512 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-util\") pod \"35bda4df-27c7-400d-bca3-30bd37f0fe76\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " Apr 21 17:43:00.142565 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.142542 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcdjg\" (UniqueName: \"kubernetes.io/projected/35bda4df-27c7-400d-bca3-30bd37f0fe76-kube-api-access-wcdjg\") pod \"35bda4df-27c7-400d-bca3-30bd37f0fe76\" (UID: \"35bda4df-27c7-400d-bca3-30bd37f0fe76\") " Apr 21 17:43:00.142860 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.142590 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-util\") pod \"8af1143b-af39-4628-bce0-d70b72ae6de4\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " Apr 21 17:43:00.142860 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.142615 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66lzh\" (UniqueName: \"kubernetes.io/projected/8af1143b-af39-4628-bce0-d70b72ae6de4-kube-api-access-66lzh\") pod \"8af1143b-af39-4628-bce0-d70b72ae6de4\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " Apr 21 17:43:00.142860 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:43:00.142682 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-bundle\") pod \"8af1143b-af39-4628-bce0-d70b72ae6de4\" (UID: \"8af1143b-af39-4628-bce0-d70b72ae6de4\") " Apr 21 17:43:00.143270 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.143239 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-bundle" (OuterVolumeSpecName: "bundle") pod "8af1143b-af39-4628-bce0-d70b72ae6de4" (UID: "8af1143b-af39-4628-bce0-d70b72ae6de4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:43:00.143347 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.143264 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-bundle" (OuterVolumeSpecName: "bundle") pod "35bda4df-27c7-400d-bca3-30bd37f0fe76" (UID: "35bda4df-27c7-400d-bca3-30bd37f0fe76"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:43:00.145328 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.145292 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35bda4df-27c7-400d-bca3-30bd37f0fe76-kube-api-access-wcdjg" (OuterVolumeSpecName: "kube-api-access-wcdjg") pod "35bda4df-27c7-400d-bca3-30bd37f0fe76" (UID: "35bda4df-27c7-400d-bca3-30bd37f0fe76"). InnerVolumeSpecName "kube-api-access-wcdjg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:43:00.145328 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.145315 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8af1143b-af39-4628-bce0-d70b72ae6de4-kube-api-access-66lzh" (OuterVolumeSpecName: "kube-api-access-66lzh") pod "8af1143b-af39-4628-bce0-d70b72ae6de4" (UID: "8af1143b-af39-4628-bce0-d70b72ae6de4"). InnerVolumeSpecName "kube-api-access-66lzh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:43:00.147676 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.147646 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-util" (OuterVolumeSpecName: "util") pod "8af1143b-af39-4628-bce0-d70b72ae6de4" (UID: "8af1143b-af39-4628-bce0-d70b72ae6de4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:43:00.148258 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.148228 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-util" (OuterVolumeSpecName: "util") pod "35bda4df-27c7-400d-bca3-30bd37f0fe76" (UID: "35bda4df-27c7-400d-bca3-30bd37f0fe76"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 17:43:00.243846 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.243809 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:00.243846 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.243840 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wcdjg\" (UniqueName: \"kubernetes.io/projected/35bda4df-27c7-400d-bca3-30bd37f0fe76-kube-api-access-wcdjg\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:00.243846 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.243852 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-util\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:00.244088 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.243861 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-66lzh\" (UniqueName: \"kubernetes.io/projected/8af1143b-af39-4628-bce0-d70b72ae6de4-kube-api-access-66lzh\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:00.244088 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.243870 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8af1143b-af39-4628-bce0-d70b72ae6de4-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:00.244088 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.243879 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bda4df-27c7-400d-bca3-30bd37f0fe76-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:00.836608 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.836570 2573 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb" event={"ID":"35bda4df-27c7-400d-bca3-30bd37f0fe76","Type":"ContainerDied","Data":"50c029a07bcad12715e7ef53b221bdda39eff3460f079592218b5385833bbb86"} Apr 21 17:43:00.836608 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.836610 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50c029a07bcad12715e7ef53b221bdda39eff3460f079592218b5385833bbb86" Apr 21 17:43:00.836856 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.836580 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb" Apr 21 17:43:00.838451 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.838425 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh" Apr 21 17:43:00.838584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.838460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh" event={"ID":"8af1143b-af39-4628-bce0-d70b72ae6de4","Type":"ContainerDied","Data":"393170c49741fa085cba719ba9b88b6842669d0aef23a7631a8a98b8c9a7c355"} Apr 21 17:43:00.838584 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:00.838495 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="393170c49741fa085cba719ba9b88b6842669d0aef23a7631a8a98b8c9a7c355" Apr 21 17:43:07.221683 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.221637 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-844fdb48-rtshf"] Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.221992 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerName="pull" Apr 21 
17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222003 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerName="pull" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222013 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerName="pull" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222018 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerName="pull" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222024 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerName="extract" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222029 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerName="extract" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222037 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerName="extract" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222042 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerName="extract" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222053 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerName="extract" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222061 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerName="extract" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222071 2573 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerName="util" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222076 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerName="util" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222083 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerName="util" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222089 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerName="util" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222094 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerName="extract" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222099 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerName="extract" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222107 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerName="pull" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222112 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerName="pull" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222118 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerName="util" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222123 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerName="util" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222146 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerName="pull" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222153 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerName="pull" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222162 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerName="util" Apr 21 17:43:07.222167 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222169 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerName="util" Apr 21 17:43:07.222889 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222226 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="35bda4df-27c7-400d-bca3-30bd37f0fe76" containerName="extract" Apr 21 17:43:07.222889 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222238 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8af1143b-af39-4628-bce0-d70b72ae6de4" containerName="extract" Apr 21 17:43:07.222889 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222245 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8942b5ef-5b9d-4ba2-8582-523f88c3feb6" containerName="extract" Apr 21 17:43:07.222889 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.222251 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1cd980a-2ba1-4c37-9d39-0bdb967db655" containerName="extract" Apr 21 17:43:07.226875 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.226845 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.239409 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.239379 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-844fdb48-rtshf"] Apr 21 17:43:07.402226 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.402186 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-console-serving-cert\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.402436 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.402246 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpzqr\" (UniqueName: \"kubernetes.io/projected/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-kube-api-access-qpzqr\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.402436 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.402316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-console-oauth-config\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.402436 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.402356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-console-config\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.402436 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.402390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-trusted-ca-bundle\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.402436 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.402426 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-service-ca\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.402618 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.402446 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-oauth-serving-cert\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.503490 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.503452 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-console-oauth-config\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.503490 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.503489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-console-config\") pod \"console-844fdb48-rtshf\" (UID: 
\"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.503773 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.503508 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-trusted-ca-bundle\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.503773 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.503627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-service-ca\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.503773 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.503659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-oauth-serving-cert\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.503928 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.503802 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-console-serving-cert\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.503928 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.503891 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpzqr\" (UniqueName: 
\"kubernetes.io/projected/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-kube-api-access-qpzqr\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.504312 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.504290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-console-config\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.504431 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.504418 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-trusted-ca-bundle\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.504495 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.504432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-service-ca\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.504639 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.504621 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-oauth-serving-cert\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.506028 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.506008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-console-oauth-config\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.506307 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.506285 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-console-serving-cert\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.514202 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.514177 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpzqr\" (UniqueName: \"kubernetes.io/projected/f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0-kube-api-access-qpzqr\") pod \"console-844fdb48-rtshf\" (UID: \"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0\") " pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.539077 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.539034 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:07.706213 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.706182 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-844fdb48-rtshf"] Apr 21 17:43:07.708069 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:43:07.708030 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf739b2eb_e5d1_450e_aaf5_f2931d0c0ff0.slice/crio-e58607cbc404933bc672cb36046275ca8fcc41491681bf361943780b9f145a70 WatchSource:0}: Error finding container e58607cbc404933bc672cb36046275ca8fcc41491681bf361943780b9f145a70: Status 404 returned error can't find the container with id e58607cbc404933bc672cb36046275ca8fcc41491681bf361943780b9f145a70 Apr 21 17:43:07.867462 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.867418 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844fdb48-rtshf" event={"ID":"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0","Type":"ContainerStarted","Data":"fb4c6bb241a8900d954a0341ba36561620b1aecb55414a6b44691da1c23b6b0d"} Apr 21 17:43:07.867462 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.867462 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844fdb48-rtshf" event={"ID":"f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0","Type":"ContainerStarted","Data":"e58607cbc404933bc672cb36046275ca8fcc41491681bf361943780b9f145a70"} Apr 21 17:43:07.887892 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:07.887829 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-844fdb48-rtshf" podStartSLOduration=0.887813951 podStartE2EDuration="887.813951ms" podCreationTimestamp="2026-04-21 17:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 17:43:07.886030099 +0000 UTC m=+572.596724208" 
watchObservedRunningTime="2026-04-21 17:43:07.887813951 +0000 UTC m=+572.598508058" Apr 21 17:43:17.539931 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:17.539888 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:17.539931 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:17.539940 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:17.544749 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:17.544722 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:17.910234 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:17.910153 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-844fdb48-rtshf" Apr 21 17:43:17.959630 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:17.959589 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-696f884676-nlxbt"] Apr 21 17:43:27.286099 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.286061 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf"] Apr 21 17:43:27.289487 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.289466 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.292236 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.292191 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 17:43:27.292382 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.292275 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 17:43:27.292382 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.292346 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 21 17:43:27.293365 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.293345 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 21 17:43:27.293431 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.293348 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-hdtws\"" Apr 21 17:43:27.299475 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.299448 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf"] Apr 21 17:43:27.384195 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.384117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/21ac171b-33e1-45a6-b129-8f41f56b967e-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-rrmsf\" (UID: \"21ac171b-33e1-45a6-b129-8f41f56b967e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.384376 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.384260 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rzfz\" (UniqueName: 
\"kubernetes.io/projected/21ac171b-33e1-45a6-b129-8f41f56b967e-kube-api-access-6rzfz\") pod \"kuadrant-console-plugin-6cb54b5c86-rrmsf\" (UID: \"21ac171b-33e1-45a6-b129-8f41f56b967e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.384415 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.384374 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/21ac171b-33e1-45a6-b129-8f41f56b967e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-rrmsf\" (UID: \"21ac171b-33e1-45a6-b129-8f41f56b967e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.484903 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.484859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/21ac171b-33e1-45a6-b129-8f41f56b967e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-rrmsf\" (UID: \"21ac171b-33e1-45a6-b129-8f41f56b967e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.484903 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.484907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/21ac171b-33e1-45a6-b129-8f41f56b967e-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-rrmsf\" (UID: \"21ac171b-33e1-45a6-b129-8f41f56b967e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.485170 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.484943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rzfz\" (UniqueName: \"kubernetes.io/projected/21ac171b-33e1-45a6-b129-8f41f56b967e-kube-api-access-6rzfz\") pod \"kuadrant-console-plugin-6cb54b5c86-rrmsf\" (UID: \"21ac171b-33e1-45a6-b129-8f41f56b967e\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.485673 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.485648 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/21ac171b-33e1-45a6-b129-8f41f56b967e-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-rrmsf\" (UID: \"21ac171b-33e1-45a6-b129-8f41f56b967e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.487470 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.487445 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/21ac171b-33e1-45a6-b129-8f41f56b967e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-rrmsf\" (UID: \"21ac171b-33e1-45a6-b129-8f41f56b967e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.494397 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.494370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rzfz\" (UniqueName: \"kubernetes.io/projected/21ac171b-33e1-45a6-b129-8f41f56b967e-kube-api-access-6rzfz\") pod \"kuadrant-console-plugin-6cb54b5c86-rrmsf\" (UID: \"21ac171b-33e1-45a6-b129-8f41f56b967e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.600833 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.600738 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" Apr 21 17:43:27.731429 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.731250 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf"] Apr 21 17:43:27.734349 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:43:27.734306 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21ac171b_33e1_45a6_b129_8f41f56b967e.slice/crio-1408240f94f8de95aa7389787ea85bacf591aa6711b75a5f9ac6e72ed82b736e WatchSource:0}: Error finding container 1408240f94f8de95aa7389787ea85bacf591aa6711b75a5f9ac6e72ed82b736e: Status 404 returned error can't find the container with id 1408240f94f8de95aa7389787ea85bacf591aa6711b75a5f9ac6e72ed82b736e Apr 21 17:43:27.940965 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:27.940875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" event={"ID":"21ac171b-33e1-45a6-b129-8f41f56b967e","Type":"ContainerStarted","Data":"1408240f94f8de95aa7389787ea85bacf591aa6711b75a5f9ac6e72ed82b736e"} Apr 21 17:43:35.818928 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:35.818849 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:43:35.819494 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:35.819003 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:43:35.825119 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:35.825095 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 
17:43:35.825517 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:35.825494 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:43:42.979765 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:42.979689 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-696f884676-nlxbt" podUID="dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" containerName="console" containerID="cri-o://34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254" gracePeriod=15 Apr 21 17:43:43.946109 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:43.946058 2573 patch_prober.go:28] interesting pod/console-696f884676-nlxbt container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.23:8443/health\": dial tcp 10.134.0.23:8443: connect: connection refused" start-of-body= Apr 21 17:43:43.946313 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:43.946126 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-696f884676-nlxbt" podUID="dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" containerName="console" probeResult="failure" output="Get \"https://10.134.0.23:8443/health\": dial tcp 10.134.0.23:8443: connect: connection refused" Apr 21 17:43:52.600784 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.600755 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-696f884676-nlxbt_dd8df74c-1bed-46ce-9580-75cfa8ffc3d6/console/0.log" Apr 21 17:43:52.601124 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.600834 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:43:52.636807 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.636774 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcg8\" (UniqueName: \"kubernetes.io/projected/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-kube-api-access-qmcg8\") pod \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " Apr 21 17:43:52.636991 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.636829 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-service-ca\") pod \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " Apr 21 17:43:52.636991 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.636889 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-trusted-ca-bundle\") pod \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " Apr 21 17:43:52.636991 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.636925 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-oauth-config\") pod \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " Apr 21 17:43:52.636991 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.636978 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-serving-cert\") pod \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " Apr 21 17:43:52.637238 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637022 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-oauth-serving-cert\") pod \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " Apr 21 17:43:52.637238 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637089 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-config\") pod \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\" (UID: \"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6\") " Apr 21 17:43:52.637453 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637328 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-service-ca" (OuterVolumeSpecName: "service-ca") pod "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" (UID: "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:43:52.637550 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637516 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" (UID: "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:43:52.637550 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637514 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" (UID: "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:43:52.637658 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637599 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-config" (OuterVolumeSpecName: "console-config") pod "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" (UID: "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 17:43:52.637808 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637789 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-oauth-serving-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:52.638025 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637810 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-config\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:52.638025 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637821 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-service-ca\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:52.638025 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.637833 
2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-trusted-ca-bundle\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:52.639718 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.639692 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-kube-api-access-qmcg8" (OuterVolumeSpecName: "kube-api-access-qmcg8") pod "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" (UID: "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6"). InnerVolumeSpecName "kube-api-access-qmcg8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:43:52.640015 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.639986 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" (UID: "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:43:52.640105 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.640021 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" (UID: "dd8df74c-1bed-46ce-9580-75cfa8ffc3d6"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:43:52.738685 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.738645 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-serving-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:52.738685 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.738682 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qmcg8\" (UniqueName: \"kubernetes.io/projected/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-kube-api-access-qmcg8\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:52.738685 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:52.738692 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6-console-oauth-config\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:43:53.056365 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.056269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" event={"ID":"21ac171b-33e1-45a6-b129-8f41f56b967e","Type":"ContainerStarted","Data":"9420d17b2f384f23c45b1176b87a3e3d775504f48b539d6709325ea0be2cdff2"} Apr 21 17:43:53.057585 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.057564 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-696f884676-nlxbt_dd8df74c-1bed-46ce-9580-75cfa8ffc3d6/console/0.log" Apr 21 17:43:53.057742 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.057600 2573 generic.go:358] "Generic (PLEG): container finished" podID="dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" containerID="34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254" exitCode=2 Apr 21 17:43:53.057742 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.057666 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-696f884676-nlxbt" event={"ID":"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6","Type":"ContainerDied","Data":"34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254"} Apr 21 17:43:53.057742 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.057673 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-696f884676-nlxbt" Apr 21 17:43:53.057742 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.057689 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-696f884676-nlxbt" event={"ID":"dd8df74c-1bed-46ce-9580-75cfa8ffc3d6","Type":"ContainerDied","Data":"2cf6e1a6e4720944e9b8c68865c62b245fbadbb922d76d4d27d0a5db8880c5db"} Apr 21 17:43:53.057742 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.057704 2573 scope.go:117] "RemoveContainer" containerID="34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254" Apr 21 17:43:53.066667 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.066647 2573 scope.go:117] "RemoveContainer" containerID="34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254" Apr 21 17:43:53.066936 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:43:53.066917 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254\": container with ID starting with 34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254 not found: ID does not exist" containerID="34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254" Apr 21 17:43:53.066990 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.066944 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254"} err="failed to get container status \"34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254\": 
rpc error: code = NotFound desc = could not find container \"34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254\": container with ID starting with 34c77671ebdb05d6955a57cde0c327f8e6f8b1bccc617147d2045db0908b6254 not found: ID does not exist" Apr 21 17:43:53.076174 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.076104 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-rrmsf" podStartSLOduration=1.289422358 podStartE2EDuration="26.076088587s" podCreationTimestamp="2026-04-21 17:43:27 +0000 UTC" firstStartedPulling="2026-04-21 17:43:27.735706324 +0000 UTC m=+592.446400413" lastFinishedPulling="2026-04-21 17:43:52.522372549 +0000 UTC m=+617.233066642" observedRunningTime="2026-04-21 17:43:53.073206797 +0000 UTC m=+617.783900905" watchObservedRunningTime="2026-04-21 17:43:53.076088587 +0000 UTC m=+617.786782695" Apr 21 17:43:53.104392 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.104354 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-696f884676-nlxbt"] Apr 21 17:43:53.113764 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.113732 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-696f884676-nlxbt"] Apr 21 17:43:53.911512 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:43:53.911474 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" path="/var/lib/kubelet/pods/dd8df74c-1bed-46ce-9580-75cfa8ffc3d6/volumes" Apr 21 17:44:14.853938 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:14.853894 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-s67qs"] Apr 21 17:44:14.854498 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:14.854314 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" containerName="console" Apr 21 17:44:14.854498 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:14.854332 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" containerName="console" Apr 21 17:44:14.854498 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:14.854405 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd8df74c-1bed-46ce-9580-75cfa8ffc3d6" containerName="console" Apr 21 17:44:14.872264 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:14.872218 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-s67qs"] Apr 21 17:44:14.872434 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:14.872344 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-s67qs" Apr 21 17:44:14.875402 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:14.875371 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-btn7q\"" Apr 21 17:44:14.930749 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:14.930708 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jndc\" (UniqueName: \"kubernetes.io/projected/96f87617-0d7d-4c1a-b148-6822af236822-kube-api-access-7jndc\") pod \"authorino-f99f4b5cd-s67qs\" (UID: \"96f87617-0d7d-4c1a-b148-6822af236822\") " pod="kuadrant-system/authorino-f99f4b5cd-s67qs" Apr 21 17:44:15.014654 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.014613 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-m29r6"] Apr 21 17:44:15.031831 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.031794 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jndc\" (UniqueName: \"kubernetes.io/projected/96f87617-0d7d-4c1a-b148-6822af236822-kube-api-access-7jndc\") pod \"authorino-f99f4b5cd-s67qs\" (UID: \"96f87617-0d7d-4c1a-b148-6822af236822\") " 
pod="kuadrant-system/authorino-f99f4b5cd-s67qs" Apr 21 17:44:15.040670 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.040642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jndc\" (UniqueName: \"kubernetes.io/projected/96f87617-0d7d-4c1a-b148-6822af236822-kube-api-access-7jndc\") pod \"authorino-f99f4b5cd-s67qs\" (UID: \"96f87617-0d7d-4c1a-b148-6822af236822\") " pod="kuadrant-system/authorino-f99f4b5cd-s67qs" Apr 21 17:44:15.049476 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.049442 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-m29r6"] Apr 21 17:44:15.049615 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.049552 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-m29r6" Apr 21 17:44:15.132664 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.132574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkwb8\" (UniqueName: \"kubernetes.io/projected/099b9855-2aca-4b14-919a-38fb218a47f5-kube-api-access-xkwb8\") pod \"authorino-7498df8756-m29r6\" (UID: \"099b9855-2aca-4b14-919a-38fb218a47f5\") " pod="kuadrant-system/authorino-7498df8756-m29r6" Apr 21 17:44:15.182252 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.182210 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-s67qs"
Apr 21 17:44:15.233601 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.233409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xkwb8\" (UniqueName: \"kubernetes.io/projected/099b9855-2aca-4b14-919a-38fb218a47f5-kube-api-access-xkwb8\") pod \"authorino-7498df8756-m29r6\" (UID: \"099b9855-2aca-4b14-919a-38fb218a47f5\") " pod="kuadrant-system/authorino-7498df8756-m29r6"
Apr 21 17:44:15.241894 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.241862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkwb8\" (UniqueName: \"kubernetes.io/projected/099b9855-2aca-4b14-919a-38fb218a47f5-kube-api-access-xkwb8\") pod \"authorino-7498df8756-m29r6\" (UID: \"099b9855-2aca-4b14-919a-38fb218a47f5\") " pod="kuadrant-system/authorino-7498df8756-m29r6"
Apr 21 17:44:15.314609 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.314580 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-s67qs"]
Apr 21 17:44:15.316312 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:44:15.316271 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96f87617_0d7d_4c1a_b148_6822af236822.slice/crio-c16665ab2566acdcaeb7fe86368ca93f8b6a8965314e78bb0c1650d80a16fbfc WatchSource:0}: Error finding container c16665ab2566acdcaeb7fe86368ca93f8b6a8965314e78bb0c1650d80a16fbfc: Status 404 returned error can't find the container with id c16665ab2566acdcaeb7fe86368ca93f8b6a8965314e78bb0c1650d80a16fbfc
Apr 21 17:44:15.317536 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.317517 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 17:44:15.358732 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.358694 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-m29r6"
Apr 21 17:44:15.489191 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:15.489160 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-m29r6"]
Apr 21 17:44:15.491161 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:44:15.491115 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod099b9855_2aca_4b14_919a_38fb218a47f5.slice/crio-946a8382e7ff5dbff699cf111f4d3d8a306ccaa3b40e137ff87a2a018582ae0f WatchSource:0}: Error finding container 946a8382e7ff5dbff699cf111f4d3d8a306ccaa3b40e137ff87a2a018582ae0f: Status 404 returned error can't find the container with id 946a8382e7ff5dbff699cf111f4d3d8a306ccaa3b40e137ff87a2a018582ae0f
Apr 21 17:44:16.153587 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:16.153542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-m29r6" event={"ID":"099b9855-2aca-4b14-919a-38fb218a47f5","Type":"ContainerStarted","Data":"946a8382e7ff5dbff699cf111f4d3d8a306ccaa3b40e137ff87a2a018582ae0f"}
Apr 21 17:44:16.155017 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:16.154980 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-s67qs" event={"ID":"96f87617-0d7d-4c1a-b148-6822af236822","Type":"ContainerStarted","Data":"c16665ab2566acdcaeb7fe86368ca93f8b6a8965314e78bb0c1650d80a16fbfc"}
Apr 21 17:44:20.174112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:20.174068 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-m29r6" event={"ID":"099b9855-2aca-4b14-919a-38fb218a47f5","Type":"ContainerStarted","Data":"e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816"}
Apr 21 17:44:20.175532 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:20.175506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-s67qs" event={"ID":"96f87617-0d7d-4c1a-b148-6822af236822","Type":"ContainerStarted","Data":"84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2"}
Apr 21 17:44:20.213599 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:20.213529 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-s67qs" podStartSLOduration=1.8425103699999998 podStartE2EDuration="6.213505994s" podCreationTimestamp="2026-04-21 17:44:14 +0000 UTC" firstStartedPulling="2026-04-21 17:44:15.317640502 +0000 UTC m=+640.028334588" lastFinishedPulling="2026-04-21 17:44:19.688636125 +0000 UTC m=+644.399330212" observedRunningTime="2026-04-21 17:44:20.209876362 +0000 UTC m=+644.920570471" watchObservedRunningTime="2026-04-21 17:44:20.213505994 +0000 UTC m=+644.924200103"
Apr 21 17:44:20.213827 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:20.213642 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-m29r6" podStartSLOduration=2.002938679 podStartE2EDuration="6.213635129s" podCreationTimestamp="2026-04-21 17:44:14 +0000 UTC" firstStartedPulling="2026-04-21 17:44:15.492559454 +0000 UTC m=+640.203253540" lastFinishedPulling="2026-04-21 17:44:19.703255904 +0000 UTC m=+644.413949990" observedRunningTime="2026-04-21 17:44:20.190362301 +0000 UTC m=+644.901056408" watchObservedRunningTime="2026-04-21 17:44:20.213635129 +0000 UTC m=+644.924329238"
Apr 21 17:44:20.221781 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:20.221734 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-s67qs"]
Apr 21 17:44:22.184375 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:22.184303 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-s67qs" podUID="96f87617-0d7d-4c1a-b148-6822af236822" containerName="authorino" containerID="cri-o://84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2" gracePeriod=30
Apr 21 17:44:22.443848 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:22.443781 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-s67qs"
Apr 21 17:44:22.499050 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:22.499014 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jndc\" (UniqueName: \"kubernetes.io/projected/96f87617-0d7d-4c1a-b148-6822af236822-kube-api-access-7jndc\") pod \"96f87617-0d7d-4c1a-b148-6822af236822\" (UID: \"96f87617-0d7d-4c1a-b148-6822af236822\") "
Apr 21 17:44:22.501142 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:22.501101 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f87617-0d7d-4c1a-b148-6822af236822-kube-api-access-7jndc" (OuterVolumeSpecName: "kube-api-access-7jndc") pod "96f87617-0d7d-4c1a-b148-6822af236822" (UID: "96f87617-0d7d-4c1a-b148-6822af236822"). InnerVolumeSpecName "kube-api-access-7jndc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 17:44:22.600068 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:22.600029 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7jndc\" (UniqueName: \"kubernetes.io/projected/96f87617-0d7d-4c1a-b148-6822af236822-kube-api-access-7jndc\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:44:23.189241 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.189204 2573 generic.go:358] "Generic (PLEG): container finished" podID="96f87617-0d7d-4c1a-b148-6822af236822" containerID="84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2" exitCode=0
Apr 21 17:44:23.189691 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.189258 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-s67qs"
Apr 21 17:44:23.189691 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.189283 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-s67qs" event={"ID":"96f87617-0d7d-4c1a-b148-6822af236822","Type":"ContainerDied","Data":"84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2"}
Apr 21 17:44:23.189691 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.189309 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-s67qs" event={"ID":"96f87617-0d7d-4c1a-b148-6822af236822","Type":"ContainerDied","Data":"c16665ab2566acdcaeb7fe86368ca93f8b6a8965314e78bb0c1650d80a16fbfc"}
Apr 21 17:44:23.189691 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.189323 2573 scope.go:117] "RemoveContainer" containerID="84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2"
Apr 21 17:44:23.199019 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.198994 2573 scope.go:117] "RemoveContainer" containerID="84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2"
Apr 21 17:44:23.199321 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:44:23.199298 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2\": container with ID starting with 84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2 not found: ID does not exist" containerID="84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2"
Apr 21 17:44:23.199375 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.199331 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2"} err="failed to get container status \"84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2\": rpc error: code = NotFound desc = could not find container \"84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2\": container with ID starting with 84d7de24b7c82510ecf799f4c0383542791f2d09fb3242322d3af765a01ee2c2 not found: ID does not exist"
Apr 21 17:44:23.211193 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.211162 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-s67qs"]
Apr 21 17:44:23.214810 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.214782 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-s67qs"]
Apr 21 17:44:23.910498 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:23.910467 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f87617-0d7d-4c1a-b148-6822af236822" path="/var/lib/kubelet/pods/96f87617-0d7d-4c1a-b148-6822af236822/volumes"
Apr 21 17:44:42.543969 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.543926 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7gxq8"]
Apr 21 17:44:42.544602 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.544333 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96f87617-0d7d-4c1a-b148-6822af236822" containerName="authorino"
Apr 21 17:44:42.544602 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.544347 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f87617-0d7d-4c1a-b148-6822af236822" containerName="authorino"
Apr 21 17:44:42.544602 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.544425 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="96f87617-0d7d-4c1a-b148-6822af236822" containerName="authorino"
Apr 21 17:44:42.558583 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.558540 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7gxq8"]
Apr 21 17:44:42.558583 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.558579 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7gxq8"
Apr 21 17:44:42.675455 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.675410 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq47l\" (UniqueName: \"kubernetes.io/projected/6ccb0933-ee69-4c0b-bea9-6d07fae63d0a-kube-api-access-bq47l\") pod \"authorino-8b475cf9f-7gxq8\" (UID: \"6ccb0933-ee69-4c0b-bea9-6d07fae63d0a\") " pod="kuadrant-system/authorino-8b475cf9f-7gxq8"
Apr 21 17:44:42.712198 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.712158 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7gxq8"]
Apr 21 17:44:42.712448 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:44:42.712428 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-bq47l], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-7gxq8" podUID="6ccb0933-ee69-4c0b-bea9-6d07fae63d0a"
Apr 21 17:44:42.741966 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.741922 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56fdd757f5-zhsdm"]
Apr 21 17:44:42.744464 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.744437 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:42.747354 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.747329 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 21 17:44:42.753482 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.753452 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-zhsdm"]
Apr 21 17:44:42.776646 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.776609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bq47l\" (UniqueName: \"kubernetes.io/projected/6ccb0933-ee69-4c0b-bea9-6d07fae63d0a-kube-api-access-bq47l\") pod \"authorino-8b475cf9f-7gxq8\" (UID: \"6ccb0933-ee69-4c0b-bea9-6d07fae63d0a\") " pod="kuadrant-system/authorino-8b475cf9f-7gxq8"
Apr 21 17:44:42.788351 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.788312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq47l\" (UniqueName: \"kubernetes.io/projected/6ccb0933-ee69-4c0b-bea9-6d07fae63d0a-kube-api-access-bq47l\") pod \"authorino-8b475cf9f-7gxq8\" (UID: \"6ccb0933-ee69-4c0b-bea9-6d07fae63d0a\") " pod="kuadrant-system/authorino-8b475cf9f-7gxq8"
Apr 21 17:44:42.800122 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.800037 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-zhsdm"]
Apr 21 17:44:42.800333 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:44:42.800312 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-44vg4 tls-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-56fdd757f5-zhsdm" podUID="88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c"
Apr 21 17:44:42.825983 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.825949 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7bf786c64f-vq9br"]
Apr 21 17:44:42.828667 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.828648 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7bf786c64f-vq9br"
Apr 21 17:44:42.839267 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.839236 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7bf786c64f-vq9br"]
Apr 21 17:44:42.877175 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.877105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-tls-cert\") pod \"authorino-56fdd757f5-zhsdm\" (UID: \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\") " pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:42.877361 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.877188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vg4\" (UniqueName: \"kubernetes.io/projected/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-kube-api-access-44vg4\") pod \"authorino-56fdd757f5-zhsdm\" (UID: \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\") " pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:42.978489 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.978453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5c363099-cb1c-4058-bea1-764dec3576d2-tls-cert\") pod \"authorino-7bf786c64f-vq9br\" (UID: \"5c363099-cb1c-4058-bea1-764dec3576d2\") " pod="kuadrant-system/authorino-7bf786c64f-vq9br"
Apr 21 17:44:42.978646 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.978504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-tls-cert\") pod \"authorino-56fdd757f5-zhsdm\" (UID: \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\") " pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:42.978646 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.978526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44vg4\" (UniqueName: \"kubernetes.io/projected/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-kube-api-access-44vg4\") pod \"authorino-56fdd757f5-zhsdm\" (UID: \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\") " pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:42.978646 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.978562 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pdw6\" (UniqueName: \"kubernetes.io/projected/5c363099-cb1c-4058-bea1-764dec3576d2-kube-api-access-5pdw6\") pod \"authorino-7bf786c64f-vq9br\" (UID: \"5c363099-cb1c-4058-bea1-764dec3576d2\") " pod="kuadrant-system/authorino-7bf786c64f-vq9br"
Apr 21 17:44:42.981084 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.981061 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-tls-cert\") pod \"authorino-56fdd757f5-zhsdm\" (UID: \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\") " pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:42.987232 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:42.987202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vg4\" (UniqueName: \"kubernetes.io/projected/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-kube-api-access-44vg4\") pod \"authorino-56fdd757f5-zhsdm\" (UID: \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\") " pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:43.079216 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.079098 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5c363099-cb1c-4058-bea1-764dec3576d2-tls-cert\") pod \"authorino-7bf786c64f-vq9br\" (UID: \"5c363099-cb1c-4058-bea1-764dec3576d2\") " pod="kuadrant-system/authorino-7bf786c64f-vq9br"
Apr 21 17:44:43.079216 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.079207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pdw6\" (UniqueName: \"kubernetes.io/projected/5c363099-cb1c-4058-bea1-764dec3576d2-kube-api-access-5pdw6\") pod \"authorino-7bf786c64f-vq9br\" (UID: \"5c363099-cb1c-4058-bea1-764dec3576d2\") " pod="kuadrant-system/authorino-7bf786c64f-vq9br"
Apr 21 17:44:43.081766 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.081729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5c363099-cb1c-4058-bea1-764dec3576d2-tls-cert\") pod \"authorino-7bf786c64f-vq9br\" (UID: \"5c363099-cb1c-4058-bea1-764dec3576d2\") " pod="kuadrant-system/authorino-7bf786c64f-vq9br"
Apr 21 17:44:43.087777 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.087739 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pdw6\" (UniqueName: \"kubernetes.io/projected/5c363099-cb1c-4058-bea1-764dec3576d2-kube-api-access-5pdw6\") pod \"authorino-7bf786c64f-vq9br\" (UID: \"5c363099-cb1c-4058-bea1-764dec3576d2\") " pod="kuadrant-system/authorino-7bf786c64f-vq9br"
Apr 21 17:44:43.139481 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.139440 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7bf786c64f-vq9br"
Apr 21 17:44:43.266471 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.266442 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7bf786c64f-vq9br"]
Apr 21 17:44:43.268646 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:44:43.268613 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c363099_cb1c_4058_bea1_764dec3576d2.slice/crio-6ce08d1c4140f36df346fa5d12d48be857714af809a8c52426e915d3712986f1 WatchSource:0}: Error finding container 6ce08d1c4140f36df346fa5d12d48be857714af809a8c52426e915d3712986f1: Status 404 returned error can't find the container with id 6ce08d1c4140f36df346fa5d12d48be857714af809a8c52426e915d3712986f1
Apr 21 17:44:43.278003 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.277966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7bf786c64f-vq9br" event={"ID":"5c363099-cb1c-4058-bea1-764dec3576d2","Type":"ContainerStarted","Data":"6ce08d1c4140f36df346fa5d12d48be857714af809a8c52426e915d3712986f1"}
Apr 21 17:44:43.278187 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.278026 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7gxq8"
Apr 21 17:44:43.278187 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.278028 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:43.283371 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.283349 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7gxq8"
Apr 21 17:44:43.287368 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.287341 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:43.381486 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.381397 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vg4\" (UniqueName: \"kubernetes.io/projected/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-kube-api-access-44vg4\") pod \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\" (UID: \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\") "
Apr 21 17:44:43.381486 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.381437 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-tls-cert\") pod \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\" (UID: \"88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c\") "
Apr 21 17:44:43.381486 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.381491 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq47l\" (UniqueName: \"kubernetes.io/projected/6ccb0933-ee69-4c0b-bea9-6d07fae63d0a-kube-api-access-bq47l\") pod \"6ccb0933-ee69-4c0b-bea9-6d07fae63d0a\" (UID: \"6ccb0933-ee69-4c0b-bea9-6d07fae63d0a\") "
Apr 21 17:44:43.383681 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.383641 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c" (UID: "88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 17:44:43.383681 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.383665 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ccb0933-ee69-4c0b-bea9-6d07fae63d0a-kube-api-access-bq47l" (OuterVolumeSpecName: "kube-api-access-bq47l") pod "6ccb0933-ee69-4c0b-bea9-6d07fae63d0a" (UID: "6ccb0933-ee69-4c0b-bea9-6d07fae63d0a"). InnerVolumeSpecName "kube-api-access-bq47l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 17:44:43.383681 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.383679 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-kube-api-access-44vg4" (OuterVolumeSpecName: "kube-api-access-44vg4") pod "88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c" (UID: "88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c"). InnerVolumeSpecName "kube-api-access-44vg4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 17:44:43.482433 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.482393 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-44vg4\" (UniqueName: \"kubernetes.io/projected/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-kube-api-access-44vg4\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:44:43.482433 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.482427 2573 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c-tls-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:44:43.482433 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:43.482439 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bq47l\" (UniqueName: \"kubernetes.io/projected/6ccb0933-ee69-4c0b-bea9-6d07fae63d0a-kube-api-access-bq47l\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:44:44.283212 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.283181 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-zhsdm"
Apr 21 17:44:44.283212 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.283182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7bf786c64f-vq9br" event={"ID":"5c363099-cb1c-4058-bea1-764dec3576d2","Type":"ContainerStarted","Data":"5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae"}
Apr 21 17:44:44.283628 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.283500 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-7gxq8"
Apr 21 17:44:44.304473 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.304418 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7bf786c64f-vq9br" podStartSLOduration=1.937050155 podStartE2EDuration="2.304399775s" podCreationTimestamp="2026-04-21 17:44:42 +0000 UTC" firstStartedPulling="2026-04-21 17:44:43.26991434 +0000 UTC m=+667.980608427" lastFinishedPulling="2026-04-21 17:44:43.637263961 +0000 UTC m=+668.347958047" observedRunningTime="2026-04-21 17:44:44.301382095 +0000 UTC m=+669.012076204" watchObservedRunningTime="2026-04-21 17:44:44.304399775 +0000 UTC m=+669.015093880"
Apr 21 17:44:44.332364 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.332327 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-zhsdm"]
Apr 21 17:44:44.350848 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.350814 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-m29r6"]
Apr 21 17:44:44.351417 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.351243 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-m29r6" podUID="099b9855-2aca-4b14-919a-38fb218a47f5" containerName="authorino" containerID="cri-o://e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816" gracePeriod=30
Apr 21 17:44:44.354785 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.354753 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-zhsdm"]
Apr 21 17:44:44.370098 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.370068 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7gxq8"]
Apr 21 17:44:44.376698 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.376667 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-7gxq8"]
Apr 21 17:44:44.613123 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.613096 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-m29r6"
Apr 21 17:44:44.691732 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.691696 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkwb8\" (UniqueName: \"kubernetes.io/projected/099b9855-2aca-4b14-919a-38fb218a47f5-kube-api-access-xkwb8\") pod \"099b9855-2aca-4b14-919a-38fb218a47f5\" (UID: \"099b9855-2aca-4b14-919a-38fb218a47f5\") "
Apr 21 17:44:44.693819 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.693792 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099b9855-2aca-4b14-919a-38fb218a47f5-kube-api-access-xkwb8" (OuterVolumeSpecName: "kube-api-access-xkwb8") pod "099b9855-2aca-4b14-919a-38fb218a47f5" (UID: "099b9855-2aca-4b14-919a-38fb218a47f5"). InnerVolumeSpecName "kube-api-access-xkwb8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 17:44:44.792642 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:44.792607 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xkwb8\" (UniqueName: \"kubernetes.io/projected/099b9855-2aca-4b14-919a-38fb218a47f5-kube-api-access-xkwb8\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\""
Apr 21 17:44:45.286988 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.286952 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rc7n9"]
Apr 21 17:44:45.287545 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.287525 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="099b9855-2aca-4b14-919a-38fb218a47f5" containerName="authorino"
Apr 21 17:44:45.287639 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.287549 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="099b9855-2aca-4b14-919a-38fb218a47f5" containerName="authorino"
Apr 21 17:44:45.287700 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.287642 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="099b9855-2aca-4b14-919a-38fb218a47f5" containerName="authorino"
Apr 21 17:44:45.289017 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.288981 2573 generic.go:358] "Generic (PLEG): container finished" podID="099b9855-2aca-4b14-919a-38fb218a47f5" containerID="e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816" exitCode=0
Apr 21 17:44:45.290023 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.289992 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-m29r6" event={"ID":"099b9855-2aca-4b14-919a-38fb218a47f5","Type":"ContainerDied","Data":"e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816"}
Apr 21 17:44:45.290023 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.290016 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-m29r6"
Apr 21 17:44:45.290023 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.290021 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-m29r6" event={"ID":"099b9855-2aca-4b14-919a-38fb218a47f5","Type":"ContainerDied","Data":"946a8382e7ff5dbff699cf111f4d3d8a306ccaa3b40e137ff87a2a018582ae0f"}
Apr 21 17:44:45.290278 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.290039 2573 scope.go:117] "RemoveContainer" containerID="e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816"
Apr 21 17:44:45.290278 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.290226 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9"
Apr 21 17:44:45.293563 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.293545 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-lnh65\""
Apr 21 17:44:45.296423 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.296390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5zml\" (UniqueName: \"kubernetes.io/projected/e9a2fd19-54de-44f8-9b84-3b863c63cee6-kube-api-access-l5zml\") pod \"maas-controller-6d4c8f55f9-rc7n9\" (UID: \"e9a2fd19-54de-44f8-9b84-3b863c63cee6\") " pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9"
Apr 21 17:44:45.301436 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.301407 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rc7n9"]
Apr 21 17:44:45.301641 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.301619 2573 scope.go:117] "RemoveContainer" containerID="e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816"
Apr 21 17:44:45.302001 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:44:45.301966 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816\": container with ID starting with e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816 not found: ID does not exist" containerID="e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816"
Apr 21 17:44:45.302114 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.301996 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816"} err="failed to get container status \"e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816\": rpc error: code = NotFound desc = could not find container \"e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816\": container with ID starting with e131ec772610f0248ac4d42ee0ab64500d8bd41669017785caf61cef19b92816 not found: ID does not exist"
Apr 21 17:44:45.335644 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.335608 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-m29r6"]
Apr 21 17:44:45.340641 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.340605 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-m29r6"]
Apr 21 17:44:45.396925 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.396889 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5zml\" (UniqueName: \"kubernetes.io/projected/e9a2fd19-54de-44f8-9b84-3b863c63cee6-kube-api-access-l5zml\") pod \"maas-controller-6d4c8f55f9-rc7n9\" (UID: \"e9a2fd19-54de-44f8-9b84-3b863c63cee6\") " pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9"
Apr 21 17:44:45.409266 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.409231 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5zml\" (UniqueName: \"kubernetes.io/projected/e9a2fd19-54de-44f8-9b84-3b863c63cee6-kube-api-access-l5zml\") pod \"maas-controller-6d4c8f55f9-rc7n9\" (UID: \"e9a2fd19-54de-44f8-9b84-3b863c63cee6\") " pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9"
Apr 21 17:44:45.600723 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.600629 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6dc454bff8-26pw4"]
Apr 21 17:44:45.604074 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.604048 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6dc454bff8-26pw4"
Apr 21 17:44:45.604074 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.604071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9"
Apr 21 17:44:45.614153 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.614105 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6dc454bff8-26pw4"]
Apr 21 17:44:45.700148 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.700052 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c2j4\" (UniqueName: \"kubernetes.io/projected/5f6b3420-422a-4205-a988-ad23c8e00733-kube-api-access-9c2j4\") pod \"maas-controller-6dc454bff8-26pw4\" (UID: \"5f6b3420-422a-4205-a988-ad23c8e00733\") " pod="opendatahub/maas-controller-6dc454bff8-26pw4"
Apr 21 17:44:45.737086 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.737059 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rc7n9"]
Apr 21 17:44:45.739654 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:44:45.739620 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9a2fd19_54de_44f8_9b84_3b863c63cee6.slice/crio-69b55f46ff93198c6af65fd00e0c0f1b751103dc17427d92f137289d463c7f55 WatchSource:0}: Error finding container 69b55f46ff93198c6af65fd00e0c0f1b751103dc17427d92f137289d463c7f55: Status 404 returned error can't find the container with id 69b55f46ff93198c6af65fd00e0c0f1b751103dc17427d92f137289d463c7f55
Apr 21 17:44:45.801490 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.801451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c2j4\" (UniqueName: \"kubernetes.io/projected/5f6b3420-422a-4205-a988-ad23c8e00733-kube-api-access-9c2j4\") pod \"maas-controller-6dc454bff8-26pw4\" (UID: \"5f6b3420-422a-4205-a988-ad23c8e00733\") " pod="opendatahub/maas-controller-6dc454bff8-26pw4"
Apr 21 17:44:45.810197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.810161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c2j4\" (UniqueName: \"kubernetes.io/projected/5f6b3420-422a-4205-a988-ad23c8e00733-kube-api-access-9c2j4\") pod \"maas-controller-6dc454bff8-26pw4\" (UID: \"5f6b3420-422a-4205-a988-ad23c8e00733\") " pod="opendatahub/maas-controller-6dc454bff8-26pw4"
Apr 21 17:44:45.910428 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.910345 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="099b9855-2aca-4b14-919a-38fb218a47f5" path="/var/lib/kubelet/pods/099b9855-2aca-4b14-919a-38fb218a47f5/volumes"
Apr 21 17:44:45.910678 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.910664 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ccb0933-ee69-4c0b-bea9-6d07fae63d0a" path="/var/lib/kubelet/pods/6ccb0933-ee69-4c0b-bea9-6d07fae63d0a/volumes"
Apr 21 17:44:45.910882 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.910872 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c" path="/var/lib/kubelet/pods/88db1bc5-c9ce-4ee5-b58f-26f75ad0f93c/volumes"
Apr 21 17:44:45.936266 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:45.936229 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6dc454bff8-26pw4"
Apr 21 17:44:46.066720 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:46.066693 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6dc454bff8-26pw4"]
Apr 21 17:44:46.068740 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:44:46.068709 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f6b3420_422a_4205_a988_ad23c8e00733.slice/crio-6b26db6dc6a89de540031d3d8f817cc336c22bf8fac99ec8543bd7934cf8e48b WatchSource:0}: Error finding container 6b26db6dc6a89de540031d3d8f817cc336c22bf8fac99ec8543bd7934cf8e48b: Status 404 returned error can't find the container with id 6b26db6dc6a89de540031d3d8f817cc336c22bf8fac99ec8543bd7934cf8e48b
Apr 21 17:44:46.295845 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:46.295803 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" event={"ID":"e9a2fd19-54de-44f8-9b84-3b863c63cee6","Type":"ContainerStarted","Data":"69b55f46ff93198c6af65fd00e0c0f1b751103dc17427d92f137289d463c7f55"}
Apr 21 17:44:46.297201 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:46.297173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6dc454bff8-26pw4" event={"ID":"5f6b3420-422a-4205-a988-ad23c8e00733","Type":"ContainerStarted","Data":"6b26db6dc6a89de540031d3d8f817cc336c22bf8fac99ec8543bd7934cf8e48b"}
Apr 21 17:44:49.317982 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:49.317943 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6dc454bff8-26pw4" event={"ID":"5f6b3420-422a-4205-a988-ad23c8e00733","Type":"ContainerStarted","Data":"60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc"}
Apr 21 17:44:49.318493 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:49.318048 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="opendatahub/maas-controller-6dc454bff8-26pw4" Apr 21 17:44:49.319524 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:49.319502 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" event={"ID":"e9a2fd19-54de-44f8-9b84-3b863c63cee6","Type":"ContainerStarted","Data":"02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246"} Apr 21 17:44:49.319676 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:49.319659 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" Apr 21 17:44:49.337833 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:49.337774 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6dc454bff8-26pw4" podStartSLOduration=1.679632596 podStartE2EDuration="4.337756912s" podCreationTimestamp="2026-04-21 17:44:45 +0000 UTC" firstStartedPulling="2026-04-21 17:44:46.070110543 +0000 UTC m=+670.780804629" lastFinishedPulling="2026-04-21 17:44:48.728234856 +0000 UTC m=+673.438928945" observedRunningTime="2026-04-21 17:44:49.336105538 +0000 UTC m=+674.046799647" watchObservedRunningTime="2026-04-21 17:44:49.337756912 +0000 UTC m=+674.048451020" Apr 21 17:44:49.356084 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:44:49.356017 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" podStartSLOduration=1.371041226 podStartE2EDuration="4.35599861s" podCreationTimestamp="2026-04-21 17:44:45 +0000 UTC" firstStartedPulling="2026-04-21 17:44:45.741036833 +0000 UTC m=+670.451730918" lastFinishedPulling="2026-04-21 17:44:48.725994212 +0000 UTC m=+673.436688302" observedRunningTime="2026-04-21 17:44:49.353485149 +0000 UTC m=+674.064179254" watchObservedRunningTime="2026-04-21 17:44:49.35599861 +0000 UTC m=+674.066692717" Apr 21 17:45:00.144354 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.144319 2573 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29613225-nvgzl"] Apr 21 17:45:00.147790 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.147767 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" Apr 21 17:45:00.151227 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.151199 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-fq944\"" Apr 21 17:45:00.158040 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.158010 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613225-nvgzl"] Apr 21 17:45:00.239210 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.239171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmghv\" (UniqueName: \"kubernetes.io/projected/980a9aa1-f89d-4089-b35a-a317ab3eb2f5-kube-api-access-bmghv\") pod \"maas-api-key-cleanup-29613225-nvgzl\" (UID: \"980a9aa1-f89d-4089-b35a-a317ab3eb2f5\") " pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" Apr 21 17:45:00.328650 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.328608 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" Apr 21 17:45:00.329065 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.329046 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6dc454bff8-26pw4" Apr 21 17:45:00.340149 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.340106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmghv\" (UniqueName: \"kubernetes.io/projected/980a9aa1-f89d-4089-b35a-a317ab3eb2f5-kube-api-access-bmghv\") pod \"maas-api-key-cleanup-29613225-nvgzl\" (UID: \"980a9aa1-f89d-4089-b35a-a317ab3eb2f5\") " pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" 
Apr 21 17:45:00.349499 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.349467 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmghv\" (UniqueName: \"kubernetes.io/projected/980a9aa1-f89d-4089-b35a-a317ab3eb2f5-kube-api-access-bmghv\") pod \"maas-api-key-cleanup-29613225-nvgzl\" (UID: \"980a9aa1-f89d-4089-b35a-a317ab3eb2f5\") " pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" Apr 21 17:45:00.384548 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.384514 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rc7n9"] Apr 21 17:45:00.384737 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.384720 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" podUID="e9a2fd19-54de-44f8-9b84-3b863c63cee6" containerName="manager" containerID="cri-o://02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246" gracePeriod=10 Apr 21 17:45:00.459108 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.459071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" Apr 21 17:45:00.651393 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.651367 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" Apr 21 17:45:00.689259 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.689173 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6b496cc5d9-bl8kw"] Apr 21 17:45:00.689568 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.689555 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9a2fd19-54de-44f8-9b84-3b863c63cee6" containerName="manager" Apr 21 17:45:00.689610 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.689571 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9a2fd19-54de-44f8-9b84-3b863c63cee6" containerName="manager" Apr 21 17:45:00.689654 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.689645 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9a2fd19-54de-44f8-9b84-3b863c63cee6" containerName="manager" Apr 21 17:45:00.692833 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.692809 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" Apr 21 17:45:00.701543 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.701515 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6b496cc5d9-bl8kw"] Apr 21 17:45:00.744812 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.744770 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5zml\" (UniqueName: \"kubernetes.io/projected/e9a2fd19-54de-44f8-9b84-3b863c63cee6-kube-api-access-l5zml\") pod \"e9a2fd19-54de-44f8-9b84-3b863c63cee6\" (UID: \"e9a2fd19-54de-44f8-9b84-3b863c63cee6\") " Apr 21 17:45:00.745027 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.744939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh5vt\" (UniqueName: \"kubernetes.io/projected/d4a79093-1cd9-4a27-9358-91f072a5294b-kube-api-access-zh5vt\") pod \"maas-controller-6b496cc5d9-bl8kw\" (UID: \"d4a79093-1cd9-4a27-9358-91f072a5294b\") " pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" Apr 21 17:45:00.747099 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.747070 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9a2fd19-54de-44f8-9b84-3b863c63cee6-kube-api-access-l5zml" (OuterVolumeSpecName: "kube-api-access-l5zml") pod "e9a2fd19-54de-44f8-9b84-3b863c63cee6" (UID: "e9a2fd19-54de-44f8-9b84-3b863c63cee6"). InnerVolumeSpecName "kube-api-access-l5zml". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:45:00.804125 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.804096 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613225-nvgzl"] Apr 21 17:45:00.806078 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:45:00.806041 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod980a9aa1_f89d_4089_b35a_a317ab3eb2f5.slice/crio-7c0470dc323a12c25ebdf18503605652373fd5600afb6d55a4fb6823c15e5b32 WatchSource:0}: Error finding container 7c0470dc323a12c25ebdf18503605652373fd5600afb6d55a4fb6823c15e5b32: Status 404 returned error can't find the container with id 7c0470dc323a12c25ebdf18503605652373fd5600afb6d55a4fb6823c15e5b32 Apr 21 17:45:00.846391 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.846349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zh5vt\" (UniqueName: \"kubernetes.io/projected/d4a79093-1cd9-4a27-9358-91f072a5294b-kube-api-access-zh5vt\") pod \"maas-controller-6b496cc5d9-bl8kw\" (UID: \"d4a79093-1cd9-4a27-9358-91f072a5294b\") " pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" Apr 21 17:45:00.846564 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.846499 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l5zml\" (UniqueName: \"kubernetes.io/projected/e9a2fd19-54de-44f8-9b84-3b863c63cee6-kube-api-access-l5zml\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:45:00.855605 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:00.855572 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh5vt\" (UniqueName: \"kubernetes.io/projected/d4a79093-1cd9-4a27-9358-91f072a5294b-kube-api-access-zh5vt\") pod \"maas-controller-6b496cc5d9-bl8kw\" (UID: \"d4a79093-1cd9-4a27-9358-91f072a5294b\") " pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" Apr 21 
17:45:01.004766 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.004724 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" Apr 21 17:45:01.137485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.137456 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6b496cc5d9-bl8kw"] Apr 21 17:45:01.140397 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:45:01.140361 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4a79093_1cd9_4a27_9358_91f072a5294b.slice/crio-6c9b2b5c8aa2d5200321fb67d40a875ddfc215591634e805a97976a47e74b8f8 WatchSource:0}: Error finding container 6c9b2b5c8aa2d5200321fb67d40a875ddfc215591634e805a97976a47e74b8f8: Status 404 returned error can't find the container with id 6c9b2b5c8aa2d5200321fb67d40a875ddfc215591634e805a97976a47e74b8f8 Apr 21 17:45:01.368214 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.368173 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" event={"ID":"d4a79093-1cd9-4a27-9358-91f072a5294b","Type":"ContainerStarted","Data":"6c9b2b5c8aa2d5200321fb67d40a875ddfc215591634e805a97976a47e74b8f8"} Apr 21 17:45:01.369366 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.369338 2573 generic.go:358] "Generic (PLEG): container finished" podID="e9a2fd19-54de-44f8-9b84-3b863c63cee6" containerID="02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246" exitCode=0 Apr 21 17:45:01.369482 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.369410 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" Apr 21 17:45:01.369482 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.369426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" event={"ID":"e9a2fd19-54de-44f8-9b84-3b863c63cee6","Type":"ContainerDied","Data":"02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246"} Apr 21 17:45:01.369482 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.369460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rc7n9" event={"ID":"e9a2fd19-54de-44f8-9b84-3b863c63cee6","Type":"ContainerDied","Data":"69b55f46ff93198c6af65fd00e0c0f1b751103dc17427d92f137289d463c7f55"} Apr 21 17:45:01.369606 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.369483 2573 scope.go:117] "RemoveContainer" containerID="02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246" Apr 21 17:45:01.370780 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.370755 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" event={"ID":"980a9aa1-f89d-4089-b35a-a317ab3eb2f5","Type":"ContainerStarted","Data":"7c0470dc323a12c25ebdf18503605652373fd5600afb6d55a4fb6823c15e5b32"} Apr 21 17:45:01.379878 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.379859 2573 scope.go:117] "RemoveContainer" containerID="02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246" Apr 21 17:45:01.380182 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:45:01.380163 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246\": container with ID starting with 02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246 not found: ID does not exist" containerID="02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246" Apr 21 17:45:01.380244 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.380191 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246"} err="failed to get container status \"02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246\": rpc error: code = NotFound desc = could not find container \"02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246\": container with ID starting with 02c91dc2425f7ac03ae793784f67bdc08f936155604bda984d4a5fd25134c246 not found: ID does not exist" Apr 21 17:45:01.392802 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.392764 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rc7n9"] Apr 21 17:45:01.396459 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.396430 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rc7n9"] Apr 21 17:45:01.910124 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:01.910039 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9a2fd19-54de-44f8-9b84-3b863c63cee6" path="/var/lib/kubelet/pods/e9a2fd19-54de-44f8-9b84-3b863c63cee6/volumes" Apr 21 17:45:02.376284 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:02.376241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" event={"ID":"980a9aa1-f89d-4089-b35a-a317ab3eb2f5","Type":"ContainerStarted","Data":"4779cae0b11c81012d18ce148ad31f72f2842a039e86ea6edac53ff9ec907341"} Apr 21 17:45:02.377865 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:02.377838 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" event={"ID":"d4a79093-1cd9-4a27-9358-91f072a5294b","Type":"ContainerStarted","Data":"1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e"} Apr 21 17:45:02.378019 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:02.377961 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" Apr 21 17:45:02.392899 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:02.392841 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" podStartSLOduration=1.429257104 podStartE2EDuration="2.392825641s" podCreationTimestamp="2026-04-21 17:45:00 +0000 UTC" firstStartedPulling="2026-04-21 17:45:00.807890057 +0000 UTC m=+685.518584147" lastFinishedPulling="2026-04-21 17:45:01.771458594 +0000 UTC m=+686.482152684" observedRunningTime="2026-04-21 17:45:02.391157059 +0000 UTC m=+687.101851161" watchObservedRunningTime="2026-04-21 17:45:02.392825641 +0000 UTC m=+687.103519769" Apr 21 17:45:02.407304 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:02.407242 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" podStartSLOduration=1.993825278 podStartE2EDuration="2.407225913s" podCreationTimestamp="2026-04-21 17:45:00 +0000 UTC" firstStartedPulling="2026-04-21 17:45:01.1417421 +0000 UTC m=+685.852436186" lastFinishedPulling="2026-04-21 17:45:01.555142733 +0000 UTC m=+686.265836821" observedRunningTime="2026-04-21 17:45:02.405931047 +0000 UTC m=+687.116625156" watchObservedRunningTime="2026-04-21 17:45:02.407225913 +0000 UTC m=+687.117920021" Apr 21 17:45:13.388079 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:13.388045 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" Apr 21 17:45:13.426980 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:13.426936 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6dc454bff8-26pw4"] Apr 21 17:45:13.427315 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:13.427269 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="opendatahub/maas-controller-6dc454bff8-26pw4" podUID="5f6b3420-422a-4205-a988-ad23c8e00733" containerName="manager" containerID="cri-o://60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc" gracePeriod=10 Apr 21 17:45:13.678166 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:13.678123 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6dc454bff8-26pw4" Apr 21 17:45:13.766117 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:13.766072 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c2j4\" (UniqueName: \"kubernetes.io/projected/5f6b3420-422a-4205-a988-ad23c8e00733-kube-api-access-9c2j4\") pod \"5f6b3420-422a-4205-a988-ad23c8e00733\" (UID: \"5f6b3420-422a-4205-a988-ad23c8e00733\") " Apr 21 17:45:13.768279 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:13.768248 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f6b3420-422a-4205-a988-ad23c8e00733-kube-api-access-9c2j4" (OuterVolumeSpecName: "kube-api-access-9c2j4") pod "5f6b3420-422a-4205-a988-ad23c8e00733" (UID: "5f6b3420-422a-4205-a988-ad23c8e00733"). InnerVolumeSpecName "kube-api-access-9c2j4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:45:13.867284 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:13.867244 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9c2j4\" (UniqueName: \"kubernetes.io/projected/5f6b3420-422a-4205-a988-ad23c8e00733-kube-api-access-9c2j4\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:45:14.423982 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:14.423945 2573 generic.go:358] "Generic (PLEG): container finished" podID="5f6b3420-422a-4205-a988-ad23c8e00733" containerID="60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc" exitCode=0 Apr 21 17:45:14.424438 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:14.423994 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6dc454bff8-26pw4" event={"ID":"5f6b3420-422a-4205-a988-ad23c8e00733","Type":"ContainerDied","Data":"60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc"} Apr 21 17:45:14.424438 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:14.424013 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6dc454bff8-26pw4" Apr 21 17:45:14.424438 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:14.424031 2573 scope.go:117] "RemoveContainer" containerID="60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc" Apr 21 17:45:14.424438 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:14.424021 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6dc454bff8-26pw4" event={"ID":"5f6b3420-422a-4205-a988-ad23c8e00733","Type":"ContainerDied","Data":"6b26db6dc6a89de540031d3d8f817cc336c22bf8fac99ec8543bd7934cf8e48b"} Apr 21 17:45:14.434516 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:14.434335 2573 scope.go:117] "RemoveContainer" containerID="60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc" Apr 21 17:45:14.434693 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:45:14.434671 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc\": container with ID starting with 60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc not found: ID does not exist" containerID="60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc" Apr 21 17:45:14.434741 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:14.434704 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc"} err="failed to get container status \"60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc\": rpc error: code = NotFound desc = could not find container \"60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc\": container with ID starting with 60753605ef18bfc396701ac8f14958ff4e2a95d859162b057e3a1b6413ab11bc not found: ID does not exist" Apr 21 17:45:14.444385 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:14.444343 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6dc454bff8-26pw4"] Apr 21 17:45:14.453721 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:14.453689 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6dc454bff8-26pw4"] Apr 21 17:45:15.916223 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:15.916183 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f6b3420-422a-4205-a988-ad23c8e00733" path="/var/lib/kubelet/pods/5f6b3420-422a-4205-a988-ad23c8e00733/volumes" Apr 21 17:45:19.740284 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.740249 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-cb7c66f54-r2wd6"] Apr 21 17:45:19.740675 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.740627 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f6b3420-422a-4205-a988-ad23c8e00733" containerName="manager" Apr 21 17:45:19.740675 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.740638 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f6b3420-422a-4205-a988-ad23c8e00733" containerName="manager" Apr 21 17:45:19.740748 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.740700 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f6b3420-422a-4205-a988-ad23c8e00733" containerName="manager" Apr 21 17:45:19.745114 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.745090 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:19.747835 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.747804 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 17:45:19.747835 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.747813 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 17:45:19.754462 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.754437 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-cb7c66f54-r2wd6"] Apr 21 17:45:19.820829 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.820790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz22n\" (UniqueName: \"kubernetes.io/projected/3f0600be-e396-4b46-b5bd-0fb26082ab1e-kube-api-access-pz22n\") pod \"maas-api-cb7c66f54-r2wd6\" (UID: \"3f0600be-e396-4b46-b5bd-0fb26082ab1e\") " pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:19.821035 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.820878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3f0600be-e396-4b46-b5bd-0fb26082ab1e-maas-api-tls\") pod \"maas-api-cb7c66f54-r2wd6\" (UID: \"3f0600be-e396-4b46-b5bd-0fb26082ab1e\") " pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:19.921868 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.921832 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pz22n\" (UniqueName: \"kubernetes.io/projected/3f0600be-e396-4b46-b5bd-0fb26082ab1e-kube-api-access-pz22n\") pod \"maas-api-cb7c66f54-r2wd6\" (UID: \"3f0600be-e396-4b46-b5bd-0fb26082ab1e\") " pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:19.922070 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.921901 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3f0600be-e396-4b46-b5bd-0fb26082ab1e-maas-api-tls\") pod \"maas-api-cb7c66f54-r2wd6\" (UID: \"3f0600be-e396-4b46-b5bd-0fb26082ab1e\") " pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:19.924725 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.924690 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3f0600be-e396-4b46-b5bd-0fb26082ab1e-maas-api-tls\") pod \"maas-api-cb7c66f54-r2wd6\" (UID: \"3f0600be-e396-4b46-b5bd-0fb26082ab1e\") " pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:19.931657 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:19.931623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz22n\" (UniqueName: \"kubernetes.io/projected/3f0600be-e396-4b46-b5bd-0fb26082ab1e-kube-api-access-pz22n\") pod \"maas-api-cb7c66f54-r2wd6\" (UID: \"3f0600be-e396-4b46-b5bd-0fb26082ab1e\") " pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:20.057487 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:20.057395 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:20.193961 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:20.193922 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-cb7c66f54-r2wd6"] Apr 21 17:45:20.197546 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:45:20.197515 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f0600be_e396_4b46_b5bd_0fb26082ab1e.slice/crio-94aaec0fe28da0b6c366ae67560aa49b30e5ec7f1f9e332b24df90cd7ff04555 WatchSource:0}: Error finding container 94aaec0fe28da0b6c366ae67560aa49b30e5ec7f1f9e332b24df90cd7ff04555: Status 404 returned error can't find the container with id 94aaec0fe28da0b6c366ae67560aa49b30e5ec7f1f9e332b24df90cd7ff04555 Apr 21 17:45:20.449180 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:20.449043 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-cb7c66f54-r2wd6" event={"ID":"3f0600be-e396-4b46-b5bd-0fb26082ab1e","Type":"ContainerStarted","Data":"94aaec0fe28da0b6c366ae67560aa49b30e5ec7f1f9e332b24df90cd7ff04555"} Apr 21 17:45:23.465383 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:23.465346 2573 generic.go:358] "Generic (PLEG): container finished" podID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerID="4779cae0b11c81012d18ce148ad31f72f2842a039e86ea6edac53ff9ec907341" exitCode=6 Apr 21 17:45:23.465885 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:23.465425 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" event={"ID":"980a9aa1-f89d-4089-b35a-a317ab3eb2f5","Type":"ContainerDied","Data":"4779cae0b11c81012d18ce148ad31f72f2842a039e86ea6edac53ff9ec907341"} Apr 21 17:45:23.465885 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:23.465880 2573 scope.go:117] "RemoveContainer" containerID="4779cae0b11c81012d18ce148ad31f72f2842a039e86ea6edac53ff9ec907341" Apr 21 17:45:23.467032 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:45:23.467001 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-cb7c66f54-r2wd6" event={"ID":"3f0600be-e396-4b46-b5bd-0fb26082ab1e","Type":"ContainerStarted","Data":"cf068736a56f1d1aee9c3559a69c2550d3d401095ddcca9504ff67f0b230ae17"} Apr 21 17:45:23.467206 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:23.467193 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:23.512162 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:23.510734 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-cb7c66f54-r2wd6" podStartSLOduration=2.22078518 podStartE2EDuration="4.510710613s" podCreationTimestamp="2026-04-21 17:45:19 +0000 UTC" firstStartedPulling="2026-04-21 17:45:20.198898 +0000 UTC m=+704.909592086" lastFinishedPulling="2026-04-21 17:45:22.488823427 +0000 UTC m=+707.199517519" observedRunningTime="2026-04-21 17:45:23.509167811 +0000 UTC m=+708.219861920" watchObservedRunningTime="2026-04-21 17:45:23.510710613 +0000 UTC m=+708.221404724" Apr 21 17:45:24.472728 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:24.472688 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" event={"ID":"980a9aa1-f89d-4089-b35a-a317ab3eb2f5","Type":"ContainerStarted","Data":"47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df"} Apr 21 17:45:29.478498 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:29.478461 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-cb7c66f54-r2wd6" Apr 21 17:45:34.567501 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.567460 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf"] Apr 21 17:45:34.571713 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.571691 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.575766 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.575735 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 17:45:34.575944 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.575798 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-nnhrd\"" Apr 21 17:45:34.575944 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.575815 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 17:45:34.575944 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.575798 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 21 17:45:34.581192 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.581164 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf"] Apr 21 17:45:34.757426 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.757382 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.757673 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.757439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.757673 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.757570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.757673 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.757613 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.757673 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.757633 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.757673 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.757668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frc87\" (UniqueName: \"kubernetes.io/projected/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-kube-api-access-frc87\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.859223 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.859114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-frc87\" (UniqueName: \"kubernetes.io/projected/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-kube-api-access-frc87\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.859223 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.859215 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.859457 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.859242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.859457 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.859273 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.859457 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.859308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 
17:45:34.859457 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.859335 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.859898 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.859759 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.859898 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.859865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.860149 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.860109 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.862370 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.862055 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-dshm\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.862615 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.862593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.867457 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.867430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frc87\" (UniqueName: \"kubernetes.io/projected/cbc4077b-de1b-43b8-86cc-3a2d5ae9a316-kube-api-access-frc87\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-7t6vf\" (UID: \"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:34.882870 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:34.882834 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:35.014846 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:35.014817 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf"] Apr 21 17:45:35.016599 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:45:35.016569 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbc4077b_de1b_43b8_86cc_3a2d5ae9a316.slice/crio-2b6f8e8df883d6a198a3975186738df98f21c460de0199ee902b091c2f4ef154 WatchSource:0}: Error finding container 2b6f8e8df883d6a198a3975186738df98f21c460de0199ee902b091c2f4ef154: Status 404 returned error can't find the container with id 2b6f8e8df883d6a198a3975186738df98f21c460de0199ee902b091c2f4ef154 Apr 21 17:45:35.516928 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:35.516879 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" event={"ID":"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316","Type":"ContainerStarted","Data":"2b6f8e8df883d6a198a3975186738df98f21c460de0199ee902b091c2f4ef154"} Apr 21 17:45:40.540571 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:40.540526 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" event={"ID":"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316","Type":"ContainerStarted","Data":"4af3dbd5bce0e47a92976ff6f187a2dd4b892a9f046902c945d7e264320e8ae8"} Apr 21 17:45:44.557531 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:44.557496 2573 generic.go:358] "Generic (PLEG): container finished" podID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerID="47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df" exitCode=6 Apr 21 17:45:44.558002 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:44.557568 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" 
event={"ID":"980a9aa1-f89d-4089-b35a-a317ab3eb2f5","Type":"ContainerDied","Data":"47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df"} Apr 21 17:45:44.558002 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:44.557627 2573 scope.go:117] "RemoveContainer" containerID="4779cae0b11c81012d18ce148ad31f72f2842a039e86ea6edac53ff9ec907341" Apr 21 17:45:44.558002 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:44.557996 2573 scope.go:117] "RemoveContainer" containerID="47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df" Apr 21 17:45:44.558263 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:45:44.558245 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29613225-nvgzl_opendatahub(980a9aa1-f89d-4089-b35a-a317ab3eb2f5)\"" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" Apr 21 17:45:46.569232 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:46.569196 2573 generic.go:358] "Generic (PLEG): container finished" podID="cbc4077b-de1b-43b8-86cc-3a2d5ae9a316" containerID="4af3dbd5bce0e47a92976ff6f187a2dd4b892a9f046902c945d7e264320e8ae8" exitCode=0 Apr 21 17:45:46.569701 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:46.569270 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" event={"ID":"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316","Type":"ContainerDied","Data":"4af3dbd5bce0e47a92976ff6f187a2dd4b892a9f046902c945d7e264320e8ae8"} Apr 21 17:45:48.580869 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:48.580835 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" event={"ID":"cbc4077b-de1b-43b8-86cc-3a2d5ae9a316","Type":"ContainerStarted","Data":"5421f3928207fa432a28ad7fcee6d23c6ecebabc6545abdadf85c8dbe3cbb6fa"} Apr 21 17:45:48.581270 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:45:48.581055 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:48.600288 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:48.600242 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" podStartSLOduration=1.9466676189999998 podStartE2EDuration="14.600227207s" podCreationTimestamp="2026-04-21 17:45:34 +0000 UTC" firstStartedPulling="2026-04-21 17:45:35.021709316 +0000 UTC m=+719.732403403" lastFinishedPulling="2026-04-21 17:45:47.675268905 +0000 UTC m=+732.385962991" observedRunningTime="2026-04-21 17:45:48.599085211 +0000 UTC m=+733.309779320" watchObservedRunningTime="2026-04-21 17:45:48.600227207 +0000 UTC m=+733.310921315" Apr 21 17:45:57.904995 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:57.904950 2573 scope.go:117] "RemoveContainer" containerID="47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df" Apr 21 17:45:58.632538 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:58.632501 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" event={"ID":"980a9aa1-f89d-4089-b35a-a317ab3eb2f5","Type":"ContainerStarted","Data":"a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4"} Apr 21 17:45:58.935313 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:58.935218 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613225-nvgzl"] Apr 21 17:45:59.599529 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:59.599499 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-7t6vf" Apr 21 17:45:59.637559 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:45:59.637523 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" 
podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" containerID="cri-o://a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4" gracePeriod=30 Apr 21 17:46:18.712958 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.712923 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" Apr 21 17:46:18.716998 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.716965 2573 generic.go:358] "Generic (PLEG): container finished" podID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerID="a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4" exitCode=6 Apr 21 17:46:18.717184 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.717044 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" Apr 21 17:46:18.717184 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.717042 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" event={"ID":"980a9aa1-f89d-4089-b35a-a317ab3eb2f5","Type":"ContainerDied","Data":"a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4"} Apr 21 17:46:18.717184 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.717168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29613225-nvgzl" event={"ID":"980a9aa1-f89d-4089-b35a-a317ab3eb2f5","Type":"ContainerDied","Data":"7c0470dc323a12c25ebdf18503605652373fd5600afb6d55a4fb6823c15e5b32"} Apr 21 17:46:18.717184 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.717185 2573 scope.go:117] "RemoveContainer" containerID="a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4" Apr 21 17:46:18.728847 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.728825 2573 scope.go:117] "RemoveContainer" containerID="47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df" Apr 21 17:46:18.740087 ip-10-0-134-77 
kubenswrapper[2573]: I0421 17:46:18.740065 2573 scope.go:117] "RemoveContainer" containerID="a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4" Apr 21 17:46:18.740476 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:46:18.740454 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4\": container with ID starting with a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4 not found: ID does not exist" containerID="a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4" Apr 21 17:46:18.740576 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.740491 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4"} err="failed to get container status \"a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4\": rpc error: code = NotFound desc = could not find container \"a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4\": container with ID starting with a6e425fcf2edd6af18aacf2da047d37dc7c8755848b11836112b63fa1e5b09e4 not found: ID does not exist" Apr 21 17:46:18.740576 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.740519 2573 scope.go:117] "RemoveContainer" containerID="47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df" Apr 21 17:46:18.740811 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:46:18.740790 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df\": container with ID starting with 47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df not found: ID does not exist" containerID="47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df" Apr 21 17:46:18.740851 ip-10-0-134-77 kubenswrapper[2573]: I0421 
17:46:18.740819 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df"} err="failed to get container status \"47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df\": rpc error: code = NotFound desc = could not find container \"47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df\": container with ID starting with 47c53bde5e1c9ab0518d8735ab801f42abaa7337491e5a9544b7ee8e61a335df not found: ID does not exist" Apr 21 17:46:18.853978 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.853935 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmghv\" (UniqueName: \"kubernetes.io/projected/980a9aa1-f89d-4089-b35a-a317ab3eb2f5-kube-api-access-bmghv\") pod \"980a9aa1-f89d-4089-b35a-a317ab3eb2f5\" (UID: \"980a9aa1-f89d-4089-b35a-a317ab3eb2f5\") " Apr 21 17:46:18.856156 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.856091 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/980a9aa1-f89d-4089-b35a-a317ab3eb2f5-kube-api-access-bmghv" (OuterVolumeSpecName: "kube-api-access-bmghv") pod "980a9aa1-f89d-4089-b35a-a317ab3eb2f5" (UID: "980a9aa1-f89d-4089-b35a-a317ab3eb2f5"). InnerVolumeSpecName "kube-api-access-bmghv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:46:18.955598 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:18.955552 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bmghv\" (UniqueName: \"kubernetes.io/projected/980a9aa1-f89d-4089-b35a-a317ab3eb2f5-kube-api-access-bmghv\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:46:19.043054 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:19.043017 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613225-nvgzl"] Apr 21 17:46:19.046276 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:19.046244 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29613225-nvgzl"] Apr 21 17:46:19.910045 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:46:19.910011 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" path="/var/lib/kubelet/pods/980a9aa1-f89d-4089-b35a-a317ab3eb2f5/volumes" Apr 21 17:47:04.418430 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.418345 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79f9bcfb6d-6ld2b"] Apr 21 17:47:04.418989 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.418842 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" Apr 21 17:47:04.418989 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.418857 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" Apr 21 17:47:04.418989 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.418886 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" Apr 21 17:47:04.418989 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.418895 2573 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" Apr 21 17:47:04.418989 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.418961 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" Apr 21 17:47:04.418989 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.418971 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" Apr 21 17:47:04.418989 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.418978 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" Apr 21 17:47:04.421029 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.421013 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" Apr 21 17:47:04.428190 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.428158 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79f9bcfb6d-6ld2b"] Apr 21 17:47:04.572056 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.572011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jll2\" (UniqueName: \"kubernetes.io/projected/ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7-kube-api-access-5jll2\") pod \"authorino-79f9bcfb6d-6ld2b\" (UID: \"ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7\") " pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" Apr 21 17:47:04.572056 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.572063 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7-tls-cert\") pod \"authorino-79f9bcfb6d-6ld2b\" (UID: \"ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7\") " pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" Apr 21 17:47:04.672875 
ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.672766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jll2\" (UniqueName: \"kubernetes.io/projected/ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7-kube-api-access-5jll2\") pod \"authorino-79f9bcfb6d-6ld2b\" (UID: \"ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7\") " pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" Apr 21 17:47:04.672875 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.672815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7-tls-cert\") pod \"authorino-79f9bcfb6d-6ld2b\" (UID: \"ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7\") " pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" Apr 21 17:47:04.675355 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.675332 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7-tls-cert\") pod \"authorino-79f9bcfb6d-6ld2b\" (UID: \"ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7\") " pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" Apr 21 17:47:04.680467 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.680435 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jll2\" (UniqueName: \"kubernetes.io/projected/ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7-kube-api-access-5jll2\") pod \"authorino-79f9bcfb6d-6ld2b\" (UID: \"ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7\") " pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" Apr 21 17:47:04.732274 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.732240 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" Apr 21 17:47:04.873621 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.873591 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79f9bcfb6d-6ld2b"] Apr 21 17:47:04.875905 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:47:04.875877 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded56a9b9_7b80_48d9_ad78_1ca3034ae3c7.slice/crio-cc25536ce0143911d6a7a14f01c129bae6a9e657bde5dcd8145b77790d2c454a WatchSource:0}: Error finding container cc25536ce0143911d6a7a14f01c129bae6a9e657bde5dcd8145b77790d2c454a: Status 404 returned error can't find the container with id cc25536ce0143911d6a7a14f01c129bae6a9e657bde5dcd8145b77790d2c454a Apr 21 17:47:04.908205 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:04.908164 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" event={"ID":"ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7","Type":"ContainerStarted","Data":"cc25536ce0143911d6a7a14f01c129bae6a9e657bde5dcd8145b77790d2c454a"} Apr 21 17:47:05.914351 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:05.914298 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" event={"ID":"ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7","Type":"ContainerStarted","Data":"f410b8a5052fc9f106de5b26cb9a5d43e43695695d882af5849553e98ed16164"} Apr 21 17:47:05.958029 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:05.957973 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79f9bcfb6d-6ld2b" podStartSLOduration=1.447186575 podStartE2EDuration="1.957954903s" podCreationTimestamp="2026-04-21 17:47:04 +0000 UTC" firstStartedPulling="2026-04-21 17:47:04.877659697 +0000 UTC m=+809.588353784" lastFinishedPulling="2026-04-21 17:47:05.388428012 +0000 UTC m=+810.099122112" 
observedRunningTime="2026-04-21 17:47:05.956390708 +0000 UTC m=+810.667084819" watchObservedRunningTime="2026-04-21 17:47:05.957954903 +0000 UTC m=+810.668649010" Apr 21 17:47:05.994176 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:05.994120 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7bf786c64f-vq9br"] Apr 21 17:47:05.994458 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:05.994398 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7bf786c64f-vq9br" podUID="5c363099-cb1c-4058-bea1-764dec3576d2" containerName="authorino" containerID="cri-o://5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae" gracePeriod=30 Apr 21 17:47:06.250787 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.250759 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7bf786c64f-vq9br" Apr 21 17:47:06.388064 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.388020 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pdw6\" (UniqueName: \"kubernetes.io/projected/5c363099-cb1c-4058-bea1-764dec3576d2-kube-api-access-5pdw6\") pod \"5c363099-cb1c-4058-bea1-764dec3576d2\" (UID: \"5c363099-cb1c-4058-bea1-764dec3576d2\") " Apr 21 17:47:06.388294 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.388098 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5c363099-cb1c-4058-bea1-764dec3576d2-tls-cert\") pod \"5c363099-cb1c-4058-bea1-764dec3576d2\" (UID: \"5c363099-cb1c-4058-bea1-764dec3576d2\") " Apr 21 17:47:06.390235 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.390206 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c363099-cb1c-4058-bea1-764dec3576d2-kube-api-access-5pdw6" (OuterVolumeSpecName: "kube-api-access-5pdw6") pod 
"5c363099-cb1c-4058-bea1-764dec3576d2" (UID: "5c363099-cb1c-4058-bea1-764dec3576d2"). InnerVolumeSpecName "kube-api-access-5pdw6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:47:06.398549 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.398520 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c363099-cb1c-4058-bea1-764dec3576d2-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "5c363099-cb1c-4058-bea1-764dec3576d2" (UID: "5c363099-cb1c-4058-bea1-764dec3576d2"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 17:47:06.489226 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.489189 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5pdw6\" (UniqueName: \"kubernetes.io/projected/5c363099-cb1c-4058-bea1-764dec3576d2-kube-api-access-5pdw6\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:47:06.489226 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.489222 2573 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/5c363099-cb1c-4058-bea1-764dec3576d2-tls-cert\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:47:06.919373 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.919276 2573 generic.go:358] "Generic (PLEG): container finished" podID="5c363099-cb1c-4058-bea1-764dec3576d2" containerID="5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae" exitCode=0 Apr 21 17:47:06.919373 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.919335 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7bf786c64f-vq9br" Apr 21 17:47:06.919373 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.919353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7bf786c64f-vq9br" event={"ID":"5c363099-cb1c-4058-bea1-764dec3576d2","Type":"ContainerDied","Data":"5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae"} Apr 21 17:47:06.919960 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.919393 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7bf786c64f-vq9br" event={"ID":"5c363099-cb1c-4058-bea1-764dec3576d2","Type":"ContainerDied","Data":"6ce08d1c4140f36df346fa5d12d48be857714af809a8c52426e915d3712986f1"} Apr 21 17:47:06.919960 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.919409 2573 scope.go:117] "RemoveContainer" containerID="5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae" Apr 21 17:47:06.929482 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.929462 2573 scope.go:117] "RemoveContainer" containerID="5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae" Apr 21 17:47:06.929773 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:47:06.929754 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae\": container with ID starting with 5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae not found: ID does not exist" containerID="5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae" Apr 21 17:47:06.929821 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.929784 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae"} err="failed to get container status \"5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae\": rpc error: code = 
NotFound desc = could not find container \"5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae\": container with ID starting with 5ad5179a8b4d1f00f83b58792ae6ec199d41670399641c81c479c9ed16b259ae not found: ID does not exist" Apr 21 17:47:06.945694 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.945657 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7bf786c64f-vq9br"] Apr 21 17:47:06.948955 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:06.948927 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7bf786c64f-vq9br"] Apr 21 17:47:07.913317 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:47:07.911698 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c363099-cb1c-4058-bea1-764dec3576d2" path="/var/lib/kubelet/pods/5c363099-cb1c-4058-bea1-764dec3576d2/volumes" Apr 21 17:48:28.365127 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:28.365043 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6b496cc5d9-bl8kw"] Apr 21 17:48:28.365694 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:28.365307 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" podUID="d4a79093-1cd9-4a27-9358-91f072a5294b" containerName="manager" containerID="cri-o://1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e" gracePeriod=10 Apr 21 17:48:28.611026 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:28.610998 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" Apr 21 17:48:28.739780 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:28.739744 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh5vt\" (UniqueName: \"kubernetes.io/projected/d4a79093-1cd9-4a27-9358-91f072a5294b-kube-api-access-zh5vt\") pod \"d4a79093-1cd9-4a27-9358-91f072a5294b\" (UID: \"d4a79093-1cd9-4a27-9358-91f072a5294b\") " Apr 21 17:48:28.742044 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:28.742009 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a79093-1cd9-4a27-9358-91f072a5294b-kube-api-access-zh5vt" (OuterVolumeSpecName: "kube-api-access-zh5vt") pod "d4a79093-1cd9-4a27-9358-91f072a5294b" (UID: "d4a79093-1cd9-4a27-9358-91f072a5294b"). InnerVolumeSpecName "kube-api-access-zh5vt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 17:48:28.840415 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:28.840369 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zh5vt\" (UniqueName: \"kubernetes.io/projected/d4a79093-1cd9-4a27-9358-91f072a5294b-kube-api-access-zh5vt\") on node \"ip-10-0-134-77.ec2.internal\" DevicePath \"\"" Apr 21 17:48:29.246738 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.246699 2573 generic.go:358] "Generic (PLEG): container finished" podID="d4a79093-1cd9-4a27-9358-91f072a5294b" containerID="1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e" exitCode=0 Apr 21 17:48:29.246927 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.246782 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" event={"ID":"d4a79093-1cd9-4a27-9358-91f072a5294b","Type":"ContainerDied","Data":"1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e"} Apr 21 17:48:29.246927 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.246799 2573 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" Apr 21 17:48:29.246927 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.246816 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b496cc5d9-bl8kw" event={"ID":"d4a79093-1cd9-4a27-9358-91f072a5294b","Type":"ContainerDied","Data":"6c9b2b5c8aa2d5200321fb67d40a875ddfc215591634e805a97976a47e74b8f8"} Apr 21 17:48:29.246927 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.246836 2573 scope.go:117] "RemoveContainer" containerID="1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e" Apr 21 17:48:29.256172 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.256145 2573 scope.go:117] "RemoveContainer" containerID="1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e" Apr 21 17:48:29.256524 ip-10-0-134-77 kubenswrapper[2573]: E0421 17:48:29.256492 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e\": container with ID starting with 1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e not found: ID does not exist" containerID="1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e" Apr 21 17:48:29.256603 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.256530 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e"} err="failed to get container status \"1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e\": rpc error: code = NotFound desc = could not find container \"1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e\": container with ID starting with 1c9848887d92b77a8a70a854f442c02e7f333b3cf8343f2dd0628375f621893e not found: ID does not exist" Apr 21 17:48:29.269344 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.269307 
2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6b496cc5d9-bl8kw"] Apr 21 17:48:29.272714 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.272684 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6b496cc5d9-bl8kw"] Apr 21 17:48:29.910671 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.910630 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a79093-1cd9-4a27-9358-91f072a5294b" path="/var/lib/kubelet/pods/d4a79093-1cd9-4a27-9358-91f072a5294b/volumes" Apr 21 17:48:29.953642 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.953607 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6b496cc5d9-wkts6"] Apr 21 17:48:29.953973 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.953961 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" Apr 21 17:48:29.954019 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.953975 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="980a9aa1-f89d-4089-b35a-a317ab3eb2f5" containerName="cleanup" Apr 21 17:48:29.954019 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.954010 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4a79093-1cd9-4a27-9358-91f072a5294b" containerName="manager" Apr 21 17:48:29.954019 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.954017 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a79093-1cd9-4a27-9358-91f072a5294b" containerName="manager" Apr 21 17:48:29.954112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.954037 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c363099-cb1c-4058-bea1-764dec3576d2" containerName="authorino" Apr 21 17:48:29.954112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.954043 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5c363099-cb1c-4058-bea1-764dec3576d2" containerName="authorino" Apr 21 17:48:29.954112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.954094 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c363099-cb1c-4058-bea1-764dec3576d2" containerName="authorino" Apr 21 17:48:29.954112 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.954106 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4a79093-1cd9-4a27-9358-91f072a5294b" containerName="manager" Apr 21 17:48:29.956942 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.956924 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b496cc5d9-wkts6" Apr 21 17:48:29.959426 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.959399 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-lnh65\"" Apr 21 17:48:29.965197 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:29.965123 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6b496cc5d9-wkts6"] Apr 21 17:48:30.052076 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:30.052031 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrfdb\" (UniqueName: \"kubernetes.io/projected/683f3fd7-ef1d-4fa0-9b06-57707aa39ade-kube-api-access-lrfdb\") pod \"maas-controller-6b496cc5d9-wkts6\" (UID: \"683f3fd7-ef1d-4fa0-9b06-57707aa39ade\") " pod="opendatahub/maas-controller-6b496cc5d9-wkts6" Apr 21 17:48:30.152847 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:30.152806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrfdb\" (UniqueName: \"kubernetes.io/projected/683f3fd7-ef1d-4fa0-9b06-57707aa39ade-kube-api-access-lrfdb\") pod \"maas-controller-6b496cc5d9-wkts6\" (UID: \"683f3fd7-ef1d-4fa0-9b06-57707aa39ade\") " pod="opendatahub/maas-controller-6b496cc5d9-wkts6" Apr 21 
17:48:30.163087 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:30.163016 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrfdb\" (UniqueName: \"kubernetes.io/projected/683f3fd7-ef1d-4fa0-9b06-57707aa39ade-kube-api-access-lrfdb\") pod \"maas-controller-6b496cc5d9-wkts6\" (UID: \"683f3fd7-ef1d-4fa0-9b06-57707aa39ade\") " pod="opendatahub/maas-controller-6b496cc5d9-wkts6" Apr 21 17:48:30.268822 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:30.268784 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b496cc5d9-wkts6" Apr 21 17:48:30.404472 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:30.404445 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6b496cc5d9-wkts6"] Apr 21 17:48:30.406304 ip-10-0-134-77 kubenswrapper[2573]: W0421 17:48:30.406272 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod683f3fd7_ef1d_4fa0_9b06_57707aa39ade.slice/crio-c200ec57ef9c731c0b0455e7d25b15a7d24d35a3edc3ce5e63e2ebfab7b1d5a1 WatchSource:0}: Error finding container c200ec57ef9c731c0b0455e7d25b15a7d24d35a3edc3ce5e63e2ebfab7b1d5a1: Status 404 returned error can't find the container with id c200ec57ef9c731c0b0455e7d25b15a7d24d35a3edc3ce5e63e2ebfab7b1d5a1 Apr 21 17:48:31.257095 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:31.257060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b496cc5d9-wkts6" event={"ID":"683f3fd7-ef1d-4fa0-9b06-57707aa39ade","Type":"ContainerStarted","Data":"567ba5d1eadd5fd790a8091254279d70237b21a5008bb4a828546b081b6f51f4"} Apr 21 17:48:31.257095 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:31.257099 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b496cc5d9-wkts6" 
event={"ID":"683f3fd7-ef1d-4fa0-9b06-57707aa39ade","Type":"ContainerStarted","Data":"c200ec57ef9c731c0b0455e7d25b15a7d24d35a3edc3ce5e63e2ebfab7b1d5a1"} Apr 21 17:48:31.257558 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:31.257178 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6b496cc5d9-wkts6" Apr 21 17:48:31.273412 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:31.273360 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6b496cc5d9-wkts6" podStartSLOduration=1.7725954769999999 podStartE2EDuration="2.273342984s" podCreationTimestamp="2026-04-21 17:48:29 +0000 UTC" firstStartedPulling="2026-04-21 17:48:30.407588467 +0000 UTC m=+895.118282560" lastFinishedPulling="2026-04-21 17:48:30.908335981 +0000 UTC m=+895.619030067" observedRunningTime="2026-04-21 17:48:31.272390251 +0000 UTC m=+895.983084383" watchObservedRunningTime="2026-04-21 17:48:31.273342984 +0000 UTC m=+895.984037092" Apr 21 17:48:35.855306 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:35.855274 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:48:35.857913 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:35.857877 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:48:35.859653 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:35.859631 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:48:35.862465 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:35.862442 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:48:42.267286 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:48:42.267254 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6b496cc5d9-wkts6" Apr 21 17:53:35.894485 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:53:35.894451 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:53:35.897833 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:53:35.897804 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:53:35.899267 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:53:35.899244 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:53:35.902294 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:53:35.902272 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:58:35.935713 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:58:35.935683 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:58:35.940324 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:58:35.940295 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 17:58:35.946755 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:58:35.946723 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 17:58:35.951744 ip-10-0-134-77 kubenswrapper[2573]: I0421 17:58:35.951720 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 18:03:35.971461 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:03:35.971418 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 18:03:35.975917 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:03:35.975895 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 18:03:35.985949 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:03:35.985918 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 18:03:35.991456 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:03:35.991432 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 18:08:36.005351 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:08:36.005317 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 18:08:36.009871 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:08:36.009847 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 18:08:36.022340 ip-10-0-134-77 kubenswrapper[2573]: I0421 
18:08:36.022307 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 18:08:36.026753 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:08:36.026729 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 18:09:29.177015 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:29.176924 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-79f9bcfb6d-6ld2b_ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7/authorino/0.log" Apr 21 18:09:32.919328 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:32.919293 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-cb7c66f54-r2wd6_3f0600be-e396-4b46-b5bd-0fb26082ab1e/maas-api/0.log" Apr 21 18:09:33.027771 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:33.027725 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-6b496cc5d9-wkts6_683f3fd7-ef1d-4fa0-9b06-57707aa39ade/manager/0.log" Apr 21 18:09:33.242447 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:33.242416 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb_e8a8ce54-79d0-4e08-9b00-73835796a047/manager/0.log" Apr 21 18:09:34.331655 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.331619 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh_8af1143b-af39-4628-bce0-d70b72ae6de4/util/0.log" Apr 21 18:09:34.338096 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.338068 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh_8af1143b-af39-4628-bce0-d70b72ae6de4/pull/0.log" Apr 21 
18:09:34.343808 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.343784 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh_8af1143b-af39-4628-bce0-d70b72ae6de4/extract/0.log" Apr 21 18:09:34.447975 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.447947 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm_c1cd980a-2ba1-4c37-9d39-0bdb967db655/util/0.log" Apr 21 18:09:34.454847 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.454813 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm_c1cd980a-2ba1-4c37-9d39-0bdb967db655/pull/0.log" Apr 21 18:09:34.460453 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.460430 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm_c1cd980a-2ba1-4c37-9d39-0bdb967db655/extract/0.log" Apr 21 18:09:34.564051 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.564025 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb_35bda4df-27c7-400d-bca3-30bd37f0fe76/util/0.log" Apr 21 18:09:34.569931 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.569906 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb_35bda4df-27c7-400d-bca3-30bd37f0fe76/pull/0.log" Apr 21 18:09:34.575643 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.575625 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb_35bda4df-27c7-400d-bca3-30bd37f0fe76/extract/0.log" Apr 21 18:09:34.683335 ip-10-0-134-77 kubenswrapper[2573]: I0421 
18:09:34.683256 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv_8942b5ef-5b9d-4ba2-8582-523f88c3feb6/util/0.log" Apr 21 18:09:34.689473 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.689450 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv_8942b5ef-5b9d-4ba2-8582-523f88c3feb6/pull/0.log" Apr 21 18:09:34.697008 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.696977 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv_8942b5ef-5b9d-4ba2-8582-523f88c3feb6/extract/0.log" Apr 21 18:09:34.828694 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:34.828659 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-79f9bcfb6d-6ld2b_ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7/authorino/0.log" Apr 21 18:09:35.161377 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:35.161348 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-rrmsf_21ac171b-33e1-45a6-b129-8f41f56b967e/kuadrant-console-plugin/0.log" Apr 21 18:09:36.317497 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:36.317468 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-548b8d8fcb-zgz5s_9cfc99ce-60a1-496f-91e7-3032cca09532/kube-auth-proxy/0.log" Apr 21 18:09:37.105706 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:37.105671 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-7t6vf_cbc4077b-de1b-43b8-86cc-3a2d5ae9a316/main/0.log" Apr 21 18:09:37.111461 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:37.111433 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-7t6vf_cbc4077b-de1b-43b8-86cc-3a2d5ae9a316/storage-initializer/0.log" Apr 21 18:09:41.125050 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.124993 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nf5lv/must-gather-42vdv"] Apr 21 18:09:41.128767 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.128737 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nf5lv/must-gather-42vdv" Apr 21 18:09:41.131883 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.131849 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nf5lv\"/\"openshift-service-ca.crt\"" Apr 21 18:09:41.132032 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.131971 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nf5lv\"/\"default-dockercfg-28xmb\"" Apr 21 18:09:41.133313 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.133287 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nf5lv\"/\"kube-root-ca.crt\"" Apr 21 18:09:41.150448 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.150417 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nf5lv/must-gather-42vdv"] Apr 21 18:09:41.193990 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.193951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aa9a97e2-79e8-4471-8194-73175f0dee72-must-gather-output\") pod \"must-gather-42vdv\" (UID: \"aa9a97e2-79e8-4471-8194-73175f0dee72\") " pod="openshift-must-gather-nf5lv/must-gather-42vdv" Apr 21 18:09:41.194209 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.194007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q6czl\" (UniqueName: \"kubernetes.io/projected/aa9a97e2-79e8-4471-8194-73175f0dee72-kube-api-access-q6czl\") pod \"must-gather-42vdv\" (UID: \"aa9a97e2-79e8-4471-8194-73175f0dee72\") " pod="openshift-must-gather-nf5lv/must-gather-42vdv" Apr 21 18:09:41.294452 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.294402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6czl\" (UniqueName: \"kubernetes.io/projected/aa9a97e2-79e8-4471-8194-73175f0dee72-kube-api-access-q6czl\") pod \"must-gather-42vdv\" (UID: \"aa9a97e2-79e8-4471-8194-73175f0dee72\") " pod="openshift-must-gather-nf5lv/must-gather-42vdv" Apr 21 18:09:41.294649 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.294523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aa9a97e2-79e8-4471-8194-73175f0dee72-must-gather-output\") pod \"must-gather-42vdv\" (UID: \"aa9a97e2-79e8-4471-8194-73175f0dee72\") " pod="openshift-must-gather-nf5lv/must-gather-42vdv" Apr 21 18:09:41.294874 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.294852 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/aa9a97e2-79e8-4471-8194-73175f0dee72-must-gather-output\") pod \"must-gather-42vdv\" (UID: \"aa9a97e2-79e8-4471-8194-73175f0dee72\") " pod="openshift-must-gather-nf5lv/must-gather-42vdv" Apr 21 18:09:41.302548 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.302520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6czl\" (UniqueName: \"kubernetes.io/projected/aa9a97e2-79e8-4471-8194-73175f0dee72-kube-api-access-q6czl\") pod \"must-gather-42vdv\" (UID: \"aa9a97e2-79e8-4471-8194-73175f0dee72\") " pod="openshift-must-gather-nf5lv/must-gather-42vdv" Apr 21 18:09:41.439436 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.439344 2573 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nf5lv/must-gather-42vdv" Apr 21 18:09:41.584016 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.583984 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nf5lv/must-gather-42vdv"] Apr 21 18:09:41.585999 ip-10-0-134-77 kubenswrapper[2573]: W0421 18:09:41.585971 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa9a97e2_79e8_4471_8194_73175f0dee72.slice/crio-215c79090ecdd8b527b0d73ba06e0fc39be979b3cb3f4f4f4de78231dd539078 WatchSource:0}: Error finding container 215c79090ecdd8b527b0d73ba06e0fc39be979b3cb3f4f4f4de78231dd539078: Status 404 returned error can't find the container with id 215c79090ecdd8b527b0d73ba06e0fc39be979b3cb3f4f4f4de78231dd539078 Apr 21 18:09:41.587773 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:41.587753 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 18:09:42.405718 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:42.405678 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nf5lv/must-gather-42vdv" event={"ID":"aa9a97e2-79e8-4471-8194-73175f0dee72","Type":"ContainerStarted","Data":"215c79090ecdd8b527b0d73ba06e0fc39be979b3cb3f4f4f4de78231dd539078"} Apr 21 18:09:43.413651 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:43.413605 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nf5lv/must-gather-42vdv" event={"ID":"aa9a97e2-79e8-4471-8194-73175f0dee72","Type":"ContainerStarted","Data":"0cf867409605de24624dedcf88e6e07006fe8e61273de384b2456ff1b02cad28"} Apr 21 18:09:43.414156 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:43.413658 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nf5lv/must-gather-42vdv" 
event={"ID":"aa9a97e2-79e8-4471-8194-73175f0dee72","Type":"ContainerStarted","Data":"815c21e72a5deedb5bc8bd9fcf6ad71429618a07a5cdead04c1bfcb83fc7e358"} Apr 21 18:09:43.434797 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:43.434715 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nf5lv/must-gather-42vdv" podStartSLOduration=1.595027595 podStartE2EDuration="2.434691067s" podCreationTimestamp="2026-04-21 18:09:41 +0000 UTC" firstStartedPulling="2026-04-21 18:09:41.587889277 +0000 UTC m=+2166.298583364" lastFinishedPulling="2026-04-21 18:09:42.427552748 +0000 UTC m=+2167.138246836" observedRunningTime="2026-04-21 18:09:43.430989101 +0000 UTC m=+2168.141683205" watchObservedRunningTime="2026-04-21 18:09:43.434691067 +0000 UTC m=+2168.145385177" Apr 21 18:09:44.042710 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:44.042680 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fxbtq_e4b0f8ab-bbb9-4342-8f16-fd9dd2ec095c/global-pull-secret-syncer/0.log" Apr 21 18:09:44.092106 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:44.092067 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-ct269_065ac800-604a-42e9-b0cc-e56598f56081/konnectivity-agent/0.log" Apr 21 18:09:44.204701 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:44.204670 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-77.ec2.internal_454bf7e88903cb3fed5cc9e7d8cf5d0d/haproxy/0.log" Apr 21 18:09:47.563669 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.563633 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh_8af1143b-af39-4628-bce0-d70b72ae6de4/extract/0.log" Apr 21 18:09:47.583030 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.582989 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh_8af1143b-af39-4628-bce0-d70b72ae6de4/util/0.log" Apr 21 18:09:47.608429 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.608402 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759s89fh_8af1143b-af39-4628-bce0-d70b72ae6de4/pull/0.log" Apr 21 18:09:47.635035 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.635001 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm_c1cd980a-2ba1-4c37-9d39-0bdb967db655/extract/0.log" Apr 21 18:09:47.670776 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.670747 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm_c1cd980a-2ba1-4c37-9d39-0bdb967db655/util/0.log" Apr 21 18:09:47.706728 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.706694 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rl8cm_c1cd980a-2ba1-4c37-9d39-0bdb967db655/pull/0.log" Apr 21 18:09:47.740663 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.740626 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb_35bda4df-27c7-400d-bca3-30bd37f0fe76/extract/0.log" Apr 21 18:09:47.764279 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.764244 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb_35bda4df-27c7-400d-bca3-30bd37f0fe76/util/0.log" Apr 21 18:09:47.786741 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.786708 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73szjtb_35bda4df-27c7-400d-bca3-30bd37f0fe76/pull/0.log" Apr 21 18:09:47.821952 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.821850 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv_8942b5ef-5b9d-4ba2-8582-523f88c3feb6/extract/0.log" Apr 21 18:09:47.856878 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.856840 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv_8942b5ef-5b9d-4ba2-8582-523f88c3feb6/util/0.log" Apr 21 18:09:47.876837 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:47.876810 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef17mpnv_8942b5ef-5b9d-4ba2-8582-523f88c3feb6/pull/0.log" Apr 21 18:09:48.133772 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:48.132894 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-79f9bcfb6d-6ld2b_ed56a9b9-7b80-48d9-ad78-1ca3034ae3c7/authorino/0.log" Apr 21 18:09:48.200081 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:48.200040 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-rrmsf_21ac171b-33e1-45a6-b129-8f41f56b967e/kuadrant-console-plugin/0.log" Apr 21 18:09:49.934239 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:49.934203 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-45p2m_433814e5-34ad-4fb0-bc66-58a825bc82e4/cluster-monitoring-operator/0.log" Apr 21 18:09:49.956736 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:49.956646 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-vm4tx_2725cd96-5201-42ad-93cd-e934fa8eb17e/kube-state-metrics/0.log" Apr 21 18:09:49.974745 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:49.974718 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-vm4tx_2725cd96-5201-42ad-93cd-e934fa8eb17e/kube-rbac-proxy-main/0.log" Apr 21 18:09:49.993537 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:49.993400 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-vm4tx_2725cd96-5201-42ad-93cd-e934fa8eb17e/kube-rbac-proxy-self/0.log" Apr 21 18:09:50.044161 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.044115 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-vxpw5_2bae9fa3-c4a6-4863-9da7-c77595420218/monitoring-plugin/0.log" Apr 21 18:09:50.092995 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.092953 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mn2tn_1cb4fd4e-5662-4538-8827-31633f56c7ed/node-exporter/0.log" Apr 21 18:09:50.110359 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.110267 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mn2tn_1cb4fd4e-5662-4538-8827-31633f56c7ed/kube-rbac-proxy/0.log" Apr 21 18:09:50.130354 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.130329 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mn2tn_1cb4fd4e-5662-4538-8827-31633f56c7ed/init-textfile/0.log" Apr 21 18:09:50.558844 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.558740 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-m84vb_1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e/prometheus-operator/0.log" Apr 21 18:09:50.578117 ip-10-0-134-77 kubenswrapper[2573]: 
I0421 18:09:50.578079 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-m84vb_1046fd5a-af2d-4faa-84b3-25b9c6ca7b9e/kube-rbac-proxy/0.log" Apr 21 18:09:50.640405 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.640367 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b98c9d888-h4g5h_fa555626-6c87-484e-9f9b-4f1ff5732351/telemeter-client/0.log" Apr 21 18:09:50.662877 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.662845 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b98c9d888-h4g5h_fa555626-6c87-484e-9f9b-4f1ff5732351/reload/0.log" Apr 21 18:09:50.687500 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.687469 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-b98c9d888-h4g5h_fa555626-6c87-484e-9f9b-4f1ff5732351/kube-rbac-proxy/0.log" Apr 21 18:09:50.716092 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.716062 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-569f6b9d8b-r5tpr_8166e55e-1dae-40ee-ae0f-4833d4cff10c/thanos-query/0.log" Apr 21 18:09:50.735278 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.735247 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-569f6b9d8b-r5tpr_8166e55e-1dae-40ee-ae0f-4833d4cff10c/kube-rbac-proxy-web/0.log" Apr 21 18:09:50.777681 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.777612 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-569f6b9d8b-r5tpr_8166e55e-1dae-40ee-ae0f-4833d4cff10c/kube-rbac-proxy/0.log" Apr 21 18:09:50.797678 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.797648 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-569f6b9d8b-r5tpr_8166e55e-1dae-40ee-ae0f-4833d4cff10c/prom-label-proxy/0.log" Apr 21 18:09:50.818531 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.818425 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-569f6b9d8b-r5tpr_8166e55e-1dae-40ee-ae0f-4833d4cff10c/kube-rbac-proxy-rules/0.log" Apr 21 18:09:50.837330 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:50.837304 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-569f6b9d8b-r5tpr_8166e55e-1dae-40ee-ae0f-4833d4cff10c/kube-rbac-proxy-metrics/0.log" Apr 21 18:09:52.680832 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:52.680800 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/2.log" Apr 21 18:09:52.686327 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:52.686293 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-pz6fc_8a1fd903-a226-46d4-8e61-54eac7ea70b3/console-operator/3.log" Apr 21 18:09:53.125196 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.125170 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-844fdb48-rtshf_f739b2eb-e5d1-450e-aaf5-f2931d0c0ff0/console/0.log" Apr 21 18:09:53.466527 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.466447 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9"] Apr 21 18:09:53.471397 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.471357 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.482371 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.482336 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9"] Apr 21 18:09:53.518741 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.518697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-podres\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.518975 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.518761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-proc\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.518975 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.518807 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-lib-modules\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.518975 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.518864 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-sys\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " 
pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.518975 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.518895 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxj56\" (UniqueName: \"kubernetes.io/projected/407d5fab-095b-40da-87c1-749ef8b1a131-kube-api-access-pxj56\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.620525 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.620478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-podres\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.620730 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.620541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-proc\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.620730 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.620597 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-lib-modules\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.620730 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.620626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-proc\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.620730 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.620656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-sys\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.620730 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.620679 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-podres\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.620730 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.620688 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxj56\" (UniqueName: \"kubernetes.io/projected/407d5fab-095b-40da-87c1-749ef8b1a131-kube-api-access-pxj56\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.621091 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.620742 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-lib-modules\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.621091 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.620790 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/407d5fab-095b-40da-87c1-749ef8b1a131-sys\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.629387 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.629349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxj56\" (UniqueName: \"kubernetes.io/projected/407d5fab-095b-40da-87c1-749ef8b1a131-kube-api-access-pxj56\") pod \"perf-node-gather-daemonset-r46m9\" (UID: \"407d5fab-095b-40da-87c1-749ef8b1a131\") " pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.699018 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.698990 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-lhmd6_a4e0f79c-332d-452e-95ac-b42786fe90d2/volume-data-source-validator/0.log" Apr 21 18:09:53.794352 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.794309 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:53.965299 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:53.965256 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9"] Apr 21 18:09:53.971154 ip-10-0-134-77 kubenswrapper[2573]: W0421 18:09:53.969177 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod407d5fab_095b_40da_87c1_749ef8b1a131.slice/crio-000a43204272014e8e0c0384cf18d5ea7bedf6c704d9d02eb330e66942695f69 WatchSource:0}: Error finding container 000a43204272014e8e0c0384cf18d5ea7bedf6c704d9d02eb330e66942695f69: Status 404 returned error can't find the container with id 000a43204272014e8e0c0384cf18d5ea7bedf6c704d9d02eb330e66942695f69 Apr 21 18:09:54.501151 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:54.501104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" event={"ID":"407d5fab-095b-40da-87c1-749ef8b1a131","Type":"ContainerStarted","Data":"97255a344e573ee025d99c0aacb01bcbc1eea5fe9d0d91f6158f67d902cf42a2"} Apr 21 18:09:54.501335 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:54.501165 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" event={"ID":"407d5fab-095b-40da-87c1-749ef8b1a131","Type":"ContainerStarted","Data":"000a43204272014e8e0c0384cf18d5ea7bedf6c704d9d02eb330e66942695f69"} Apr 21 18:09:54.501335 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:54.501306 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:09:54.517944 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:54.517878 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" 
podStartSLOduration=1.517861516 podStartE2EDuration="1.517861516s" podCreationTimestamp="2026-04-21 18:09:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 18:09:54.517061125 +0000 UTC m=+2179.227755231" watchObservedRunningTime="2026-04-21 18:09:54.517861516 +0000 UTC m=+2179.228555627" Apr 21 18:09:54.581748 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:54.581717 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4vw75_480585ac-7f87-43b8-98a2-398a61a10ad3/dns/0.log" Apr 21 18:09:54.598757 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:54.598723 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-4vw75_480585ac-7f87-43b8-98a2-398a61a10ad3/kube-rbac-proxy/0.log" Apr 21 18:09:54.707985 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:54.707957 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zvdf9_5dd32873-afbe-4cda-ad81-5c8d17abaeb9/dns-node-resolver/0.log" Apr 21 18:09:55.242913 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:55.242887 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6fqcj_755e2315-552e-4e00-ba7b-cf1e07a5c8d1/node-ca/0.log" Apr 21 18:09:56.253069 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:56.253040 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-548b8d8fcb-zgz5s_9cfc99ce-60a1-496f-91e7-3032cca09532/kube-auth-proxy/0.log" Apr 21 18:09:56.915055 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:56.915024 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dzd2m_85cdcbba-06ac-4178-a25d-5d6eb221a155/serve-healthcheck-canary/0.log" Apr 21 18:09:57.463247 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:57.463219 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-jb47l_a784f755-6ef2-4edf-993b-25f3e45d1082/insights-operator/0.log" Apr 21 18:09:57.463791 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:57.463771 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-jb47l_a784f755-6ef2-4edf-993b-25f3e45d1082/insights-operator/1.log" Apr 21 18:09:57.531688 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:57.531660 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nv7nt_ac21d64f-1975-43d4-bed8-a9c4ecce476f/kube-rbac-proxy/0.log" Apr 21 18:09:57.548517 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:57.548490 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nv7nt_ac21d64f-1975-43d4-bed8-a9c4ecce476f/exporter/0.log" Apr 21 18:09:57.564812 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:57.564780 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nv7nt_ac21d64f-1975-43d4-bed8-a9c4ecce476f/extractor/0.log" Apr 21 18:09:59.593144 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:59.593092 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-cb7c66f54-r2wd6_3f0600be-e396-4b46-b5bd-0fb26082ab1e/maas-api/0.log" Apr 21 18:09:59.649296 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:59.649254 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-6b496cc5d9-wkts6_683f3fd7-ef1d-4fa0-9b06-57707aa39ade/manager/0.log" Apr 21 18:09:59.709707 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:09:59.709665 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d5f5c78f5-2w8jb_e8a8ce54-79d0-4e08-9b00-73835796a047/manager/0.log" Apr 21 18:10:00.524512 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:00.524483 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nf5lv/perf-node-gather-daemonset-r46m9" Apr 21 18:10:00.895251 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:00.895176 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-bdd4f6877-g28kp_dc724ecf-4612-46dc-8525-fb6475593865/manager/0.log" Apr 21 18:10:05.372684 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:05.372610 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-flx88_3892d71e-83cc-4628-8e38-087a4c7c7787/migrator/0.log" Apr 21 18:10:05.391096 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:05.391069 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-flx88_3892d71e-83cc-4628-8e38-087a4c7c7787/graceful-termination/0.log" Apr 21 18:10:07.073815 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.073785 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcgfw_d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0/kube-multus-additional-cni-plugins/0.log" Apr 21 18:10:07.092938 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.092906 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcgfw_d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0/egress-router-binary-copy/0.log" Apr 21 18:10:07.111435 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.111407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcgfw_d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0/cni-plugins/0.log" Apr 21 18:10:07.127678 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.127648 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcgfw_d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0/bond-cni-plugin/0.log" Apr 21 
18:10:07.144977 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.144947 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcgfw_d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0/routeoverride-cni/0.log" Apr 21 18:10:07.161878 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.161848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcgfw_d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0/whereabouts-cni-bincopy/0.log" Apr 21 18:10:07.178800 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.178745 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zcgfw_d12b2fdf-28ef-4cd3-8fe8-ff7631cfc4b0/whereabouts-cni/0.log" Apr 21 18:10:07.239320 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.239273 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmg6z_97f71749-bd4e-478c-9148-e08f6598c072/kube-multus/0.log" Apr 21 18:10:07.294010 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.293979 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wtk7c_dcd8a3eb-e25c-4dcb-9468-d578a60a826c/network-metrics-daemon/0.log" Apr 21 18:10:07.308628 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:07.308602 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wtk7c_dcd8a3eb-e25c-4dcb-9468-d578a60a826c/kube-rbac-proxy/0.log" Apr 21 18:10:08.137467 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:08.137438 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-controller/0.log" Apr 21 18:10:08.153578 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:08.153549 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/0.log" Apr 21 18:10:08.163842 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:08.163811 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovn-acl-logging/1.log" Apr 21 18:10:08.183200 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:08.183160 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/kube-rbac-proxy-node/0.log" Apr 21 18:10:08.200518 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:08.200487 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 18:10:08.214180 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:08.214120 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/northd/0.log" Apr 21 18:10:08.233685 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:08.233655 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/nbdb/0.log" Apr 21 18:10:08.250441 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:08.250415 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/sbdb/0.log" Apr 21 18:10:08.367783 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:08.367750 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-74gml_3f956cbc-c15f-455e-8caf-a1b6e26e74ca/ovnkube-controller/0.log" Apr 21 18:10:10.092976 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:10.092949 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-ndw4b_fc19cc8c-7c6d-4680-948c-31fb8fb0f70e/check-endpoints/0.log" Apr 21 18:10:10.166280 ip-10-0-134-77 kubenswrapper[2573]: I0421 18:10:10.166244 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-8f9r2_b9e16605-c885-47fb-ba9d-4b218cc44030/network-check-target-container/0.log"