Apr 20 19:08:11.172756 ip-10-0-136-5 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 19:08:11.586578 ip-10-0-136-5 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:08:11.586578 ip-10-0-136-5 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 19:08:11.586578 ip-10-0-136-5 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:08:11.586578 ip-10-0-136-5 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 19:08:11.586578 ip-10-0-136-5 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:08:11.588198 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.588111 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 19:08:11.590447 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590431 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590448 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590452 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590455 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590458 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590462 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590465 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590481 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590484 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590487 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590490 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590492 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590495 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:08:11.590493 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590498 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590501 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590504 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590507 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590516 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590519 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590522 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590525 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590527 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590530 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590532 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590535 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590538 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590540 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590543 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590545 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590548 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590550 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590553 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590556 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:08:11.590824 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590558 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590560 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590565 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590569 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590573 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590577 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590580 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590583 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590586 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590588 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590591 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590593 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590595 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590598 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590601 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590604 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590607 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590610 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590612 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:08:11.591368 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590616 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590618 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590621 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590624 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590626 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590629 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590632 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590634 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590637 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590639 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590642 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590644 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590647 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590649 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590651 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590654 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590657 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590659 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590662 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590664 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:08:11.592077 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590666 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590670 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590672 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590675 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590677 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590680 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590683 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590686 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590689 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590692 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590695 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590698 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590700 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.590703 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591337 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591355 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591360 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591365 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591370 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:08:11.592590 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591374 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591379 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591384 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591395 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591399 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591403 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591408 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591412 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591416 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591420 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591424 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591429 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591433 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591437 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591441 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591445 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591454 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591458 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591463 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591481 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:08:11.593347 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591486 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591490 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591494 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591501 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591506 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591510 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591514 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591519 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591530 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591534 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591539 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591543 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591547 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591551 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591555 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591560 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591563 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591568 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591571 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:08:11.594270 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591580 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591589 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591593 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591597 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591602 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591606 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591610 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591614 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591619 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591623 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591626 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591631 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591635 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591639 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591648 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591652 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591656 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591660 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591664 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591668 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:08:11.594789 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591676 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591684 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591690 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591695 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591699 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591704 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591713 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591717 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591722 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591727 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591734 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591739 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591743 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591747 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591752 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591756 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591761 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591765 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591774 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591779 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:08:11.595377 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591783 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.591788 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592607 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592650 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592664 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592676 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592697 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592851 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592875 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592882 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592888 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592894 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592900 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592905 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592910 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592916 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592920 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592925 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592931 2572 flags.go:64] FLAG: --cloud-config=""
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592935 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592940 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592948 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592953 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592959 2572 flags.go:64] FLAG: --config-dir=""
Apr 20 19:08:11.596117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592963 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592968 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592975 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592981 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592986 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592991 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.592997 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593002 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593007 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593012 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593017 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593024 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593029 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593034 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593039 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593045 2572 flags.go:64] FLAG: --enable-server="true"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593049 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593057 2572 flags.go:64] FLAG: --event-burst="100"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593063 2572 flags.go:64] FLAG: --event-qps="50"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593068 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593073 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593078 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593084 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593089 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593094 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 19:08:11.596719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593099 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593104 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593109 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593114 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593119 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593124 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593129 2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593133 2572 flags.go:64] FLAG: --feature-gates=""
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593139 2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593144 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593149 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593155 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593160 2572 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593166 2572 flags.go:64] FLAG: --help="false"
Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593171 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-136-5.ec2.internal"
Apr 20 
19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593176 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593181 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593186 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593193 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593199 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593204 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593209 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593214 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 19:08:11.597413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593219 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593224 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593230 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593235 2572 flags.go:64] FLAG: --kube-reserved="" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593240 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593245 2572 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593250 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593255 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593259 2572 flags.go:64] FLAG: --lock-file="" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593264 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593269 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593274 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593284 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593289 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593293 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593298 2572 flags.go:64] FLAG: --logging-format="text" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593303 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593308 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593313 2572 flags.go:64] FLAG: --manifest-url="" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593318 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:08:11.593325 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593332 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593339 2572 flags.go:64] FLAG: --max-pods="110" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593344 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593348 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 19:08:11.598003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593353 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593357 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593362 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593367 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593372 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593387 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593392 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593397 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593403 2572 flags.go:64] FLAG: --pod-cidr="" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593407 2572 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593415 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593421 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593426 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593431 2572 flags.go:64] FLAG: --port="10250" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593436 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593441 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-049d181984007ca8b" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593446 2572 flags.go:64] FLAG: --qos-reserved="" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593451 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593456 2572 flags.go:64] FLAG: --register-node="true" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593460 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593465 2572 flags.go:64] FLAG: --register-with-taints="" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593490 2572 flags.go:64] FLAG: --registry-burst="10" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593496 2572 flags.go:64] FLAG: --registry-qps="5" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593500 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 20 19:08:11.598647 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:11.593505 2572 flags.go:64] FLAG: --reserved-memory="" Apr 20 19:08:11.598647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593511 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593516 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593521 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593526 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593532 2572 flags.go:64] FLAG: --runonce="false" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593541 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593547 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593551 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593556 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593561 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593565 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593571 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593576 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593581 2572 flags.go:64] FLAG: 
--storage-driver-secure="false" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593586 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593591 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593595 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593601 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593605 2572 flags.go:64] FLAG: --system-cgroups="" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593610 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593620 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593625 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593630 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593637 2572 flags.go:64] FLAG: --tls-min-version="" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593641 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 19:08:11.599263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593646 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593651 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593656 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:08:11.593660 2572 flags.go:64] FLAG: --v="2" Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593667 2572 flags.go:64] FLAG: --version="false" Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593674 2572 flags.go:64] FLAG: --vmodule="" Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593680 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.593685 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593874 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593882 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593887 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593892 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593901 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593906 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593911 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593916 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593921 2572 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593925 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593930 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593934 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593938 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:08:11.599907 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593942 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593947 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593952 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593956 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593960 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593966 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593970 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593975 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: 
W0420 19:08:11.593979 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593983 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593987 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593992 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.593996 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594000 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594004 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594008 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594019 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594025 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594030 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:08:11.600465 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594037 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594044 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594048 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594053 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594059 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594065 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594069 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594073 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594077 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594082 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594086 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594090 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594094 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594098 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 
20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594102 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594107 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594111 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594116 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594120 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594124 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:08:11.600989 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594128 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594133 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594137 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594141 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594145 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594150 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594154 2572 feature_gate.go:328] unrecognized 
feature gate: AzureClusterHostedDNSInstall Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594158 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594163 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594169 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594173 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594177 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594182 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594186 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594190 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594194 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594200 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594204 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594209 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594214 2572 feature_gate.go:328] unrecognized 
feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:08:11.601504 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594218 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594222 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594226 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594231 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594235 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594239 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594243 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594247 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594251 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594255 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594260 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594264 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 
19:08:11.594268 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.594272 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.594914 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.601412 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 19:08:11.601995 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.601533 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601580 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601584 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601587 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601590 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601593 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601595 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601598 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601601 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601603 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601606 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601608 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601611 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601614 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601618 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601621 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601623 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601626 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601629 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601632 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:08:11.602403 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601636 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601639 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601641 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601644 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601646 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601649 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601651 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601654 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601656 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601659 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601662 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601664 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601668 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601670 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601673 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601676 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601678 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601681 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601684 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601686 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:08:11.602960 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601689 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601691 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601694 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601696 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601699 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601702 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601704 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601708 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601710 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601713 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601716 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601719 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601721 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601725 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601729 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601732 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601735 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601738 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:08:11.603456 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601741 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601744 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601747 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601749 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601752 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601755 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601758 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601761 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601763 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601766 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601768 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601771 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601774 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601776 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601779 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601781 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601784 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601786 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601789 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601792 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:08:11.603995 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601795 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601798 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601801 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601804 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601807 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601809 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601812 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601814 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601817 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.601822 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601921 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601926 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601929 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601932 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601935 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601938 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:08:11.604503 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601941 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601945 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601948 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601951 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601954 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601957 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601960 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601963 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601966 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601969 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601971 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601974 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601976 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601979 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601982 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601984 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601986 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601989 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601991 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601994 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:08:11.604905 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.601997 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602000 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602003 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602006 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602009 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602011 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602014 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602017 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602019 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602022 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602024 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602027 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602029 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602032 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602034 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602037 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602040 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602043 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602045 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602048 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:08:11.605409 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602050 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602053 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602055 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602058 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602061 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602063 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602066 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602068 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602072 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602076 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602078 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602081 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602084 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602087 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602089 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602092 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602096 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602099 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602102 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:08:11.605930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602105 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602108 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602111 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602114 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602117 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602119 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602122 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602125 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602128 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602130 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602133 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602135 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602138 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602140 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602143 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602145 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602148 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602150 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602153 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602156 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:08:11.606394 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:11.602158 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:08:11.606920 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.602163 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:08:11.606920 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.602837 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 19:08:11.606920 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.606056 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 19:08:11.607008 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.606937 2572 server.go:1019] "Starting client certificate rotation"
Apr 20 19:08:11.607046 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.607030 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:08:11.607078 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.607068 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:08:11.631566 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.631548 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:08:11.633900 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.633883 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:08:11.650072 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.650053 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 20 19:08:11.655537 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.655522 2572 log.go:25] "Validated CRI v1 image API"
Apr 20 19:08:11.657173 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.657160 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 19:08:11.662072 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.662027 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a5e5429c-8dfd-46db-be59-a4962b2ea86b:/dev/nvme0n1p4 c976adae-ef17-4dcf-9816-0d824e6b8e53:/dev/nvme0n1p3]
Apr 20 19:08:11.662072 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.662046 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 19:08:11.662307 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.662292 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:08:11.667667 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.667563 2572 manager.go:217] Machine: {Timestamp:2026-04-20 19:08:11.665747311 +0000 UTC m=+0.385050073 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099679 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27c63a3a8ee426b2969e23dde280d2 SystemUUID:ec27c63a-3a8e-e426-b296-9e23dde280d2 BootID:a3523276-3e87-4a4f-90f9-749779e93876 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a8:da:6a:0f:73 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a8:da:6a:0f:73 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:a3:3e:df:cb:c1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 19:08:11.667667 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.667662 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 19:08:11.667791 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.667780 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 19:08:11.668709 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.668688 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 19:08:11.668847 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.668712 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-5.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 19:08:11.668892 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.668856 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 19:08:11.668892 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.668865 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 19:08:11.668892 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.668878 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:08:11.669387 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.669375 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:08:11.670584 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.670571 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:08:11.670688 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.670678 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 19:08:11.672594 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.672583 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 19:08:11.672633 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.672598 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 19:08:11.672633 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.672610 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 19:08:11.672633 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.672619 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 20 19:08:11.672633 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.672628 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 19:08:11.673627 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.673611 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:08:11.673761 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.673638 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:08:11.676627 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.676611 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 19:08:11.677889 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.677873 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 19:08:11.679783 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679771 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 19:08:11.679835 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679788 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 19:08:11.679835 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679795 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 19:08:11.679835 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679803 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 19:08:11.679835 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679812 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 19:08:11.679835 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679820 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 19:08:11.679835 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679826 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 19:08:11.679835 ip-10-0-136-5
kubenswrapper[2572]: I0420 19:08:11.679831 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 19:08:11.679835 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679839 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 19:08:11.680033 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679845 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 19:08:11.680033 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679854 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 19:08:11.680033 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.679862 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 19:08:11.680650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.680636 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 19:08:11.680650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.680650 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 19:08:11.682523 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.682502 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 19:08:11.682614 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.682509 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 19:08:11.682819 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.682790 2572 csi_plugin.go:988] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-5.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 19:08:11.682977 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.682961 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mvnv8" Apr 20 19:08:11.684262 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.684249 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 19:08:11.684311 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.684285 2572 server.go:1295] "Started kubelet" Apr 20 19:08:11.684408 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.684379 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 19:08:11.684490 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.684431 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 19:08:11.684548 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.684501 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 19:08:11.685167 ip-10-0-136-5 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 19:08:11.685523 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.685505 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 19:08:11.685590 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.685548 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 20 19:08:11.690458 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.690440 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 19:08:11.690970 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.690956 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 19:08:11.691050 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.690957 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mvnv8" Apr 20 19:08:11.691529 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.691511 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 19:08:11.691626 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.691536 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 19:08:11.691626 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.691513 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 19:08:11.691745 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.691656 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 20 19:08:11.691745 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.691664 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 20 19:08:11.691845 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.691760 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:11.694177 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.694024 2572 factory.go:55] Registering systemd factory Apr 20 19:08:11.694177 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:11.694044 2572 factory.go:223] Registration of the systemd container factory successfully Apr 20 19:08:11.694586 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.694563 2572 factory.go:153] Registering CRI-O factory Apr 20 19:08:11.694586 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.694581 2572 factory.go:223] Registration of the crio container factory successfully Apr 20 19:08:11.694732 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.694631 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 19:08:11.694732 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.694664 2572 factory.go:103] Registering Raw factory Apr 20 19:08:11.694732 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.694679 2572 manager.go:1196] Started watching for new ooms in manager Apr 20 19:08:11.695146 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.695130 2572 manager.go:319] Starting recovery of all containers Apr 20 19:08:11.704644 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.704614 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:08:11.708093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.708076 2572 manager.go:324] Recovery completed Apr 20 19:08:11.712381 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.712367 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:08:11.713909 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.713891 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-5.ec2.internal\" not found" node="ip-10-0-136-5.ec2.internal" Apr 20 19:08:11.716319 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.716267 2572 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:08:11.716373 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.716333 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:08:11.716373 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.716354 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:08:11.716915 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.716897 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 19:08:11.716915 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.716913 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 19:08:11.717012 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.716931 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 20 19:08:11.719026 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.719015 2572 policy_none.go:49] "None policy: Start" Apr 20 19:08:11.719090 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.719030 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 19:08:11.719090 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.719040 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 20 19:08:11.763077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.761984 2572 manager.go:341] "Starting Device Plugin manager" Apr 20 19:08:11.763077 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.762018 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 19:08:11.763077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.762028 2572 server.go:85] "Starting device plugin registration server" Apr 20 19:08:11.763077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.762261 2572 eviction_manager.go:189] "Eviction 
manager: starting control loop" Apr 20 19:08:11.763077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.762271 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 19:08:11.763077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.762360 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 19:08:11.763077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.762435 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 19:08:11.763077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.762443 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 19:08:11.763077 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.763065 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 19:08:11.763389 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.763105 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:11.822944 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.822915 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 19:08:11.824101 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.824086 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 19:08:11.824159 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.824112 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 19:08:11.824159 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.824130 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 19:08:11.824159 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.824137 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 19:08:11.824272 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.824210 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 19:08:11.828919 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.828901 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:08:11.862465 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.862400 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:08:11.864347 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.864329 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:08:11.864440 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.864365 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:08:11.864440 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.864383 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:08:11.864440 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.864415 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-5.ec2.internal" Apr 20 19:08:11.873953 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.873934 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-5.ec2.internal" Apr 20 19:08:11.874035 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.873961 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-5.ec2.internal\": node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:11.892986 ip-10-0-136-5 
kubenswrapper[2572]: E0420 19:08:11.892964 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:11.924828 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.924789 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal"] Apr 20 19:08:11.924944 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.924865 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:08:11.926375 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.926359 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:08:11.926441 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.926390 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:08:11.926441 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.926400 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:08:11.927411 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.927397 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:08:11.927542 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.927528 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" Apr 20 19:08:11.927583 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.927555 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:08:11.928125 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.928111 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:08:11.928176 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.928144 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:08:11.928210 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.928178 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:08:11.928210 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.928204 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:08:11.928288 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.928223 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:08:11.928288 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.928234 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:08:11.929220 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.929208 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 20 19:08:11.929263 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.929231 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:08:11.929878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.929863 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:08:11.930001 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.929890 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:08:11.930001 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.929903 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:08:11.957153 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.957133 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-5.ec2.internal\" not found" node="ip-10-0-136-5.ec2.internal" Apr 20 19:08:11.961410 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.961394 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-5.ec2.internal\" not found" node="ip-10-0-136-5.ec2.internal" Apr 20 19:08:11.993404 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:11.993381 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:11.993503 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.993448 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c759d37c693dd42a96a635e33d6e4429-config\") pod 
\"kube-apiserver-proxy-ip-10-0-136-5.ec2.internal\" (UID: \"c759d37c693dd42a96a635e33d6e4429\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" Apr 20 19:08:11.993503 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.993486 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 20 19:08:11.993571 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:11.993504 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.093642 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:12.093603 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:12.093810 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.093653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.093810 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.093684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.093810 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.093707 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c759d37c693dd42a96a635e33d6e4429-config\") pod \"kube-apiserver-proxy-ip-10-0-136-5.ec2.internal\" (UID: \"c759d37c693dd42a96a635e33d6e4429\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.093810 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.093736 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.093810 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.093744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c759d37c693dd42a96a635e33d6e4429-config\") pod \"kube-apiserver-proxy-ip-10-0-136-5.ec2.internal\" (UID: \"c759d37c693dd42a96a635e33d6e4429\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.093810 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.093744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2c41a124527f124f4d00e3a7bbddbf2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal\" (UID: \"e2c41a124527f124f4d00e3a7bbddbf2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" 
Apr 20 19:08:12.194226 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:12.194148 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:12.259372 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.259346 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.264758 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.264740 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.294500 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:12.294465 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:12.394897 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:12.394856 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:12.495249 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:12.495170 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:12.595706 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:12.595671 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:12.606852 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.606830 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 19:08:12.606996 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.606976 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: 
Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:08:12.607060 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.606990 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:08:12.682670 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.682640 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:08:12.691031 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.691005 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 19:08:12.696072 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:12.696047 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-5.ec2.internal\" not found" Apr 20 19:08:12.696192 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.696078 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:03:11 +0000 UTC" deadline="2027-10-04 20:28:24.314192283 +0000 UTC" Apr 20 19:08:12.696192 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.696103 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12769h20m11.618091616s" Apr 20 19:08:12.700831 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.700813 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 19:08:12.721264 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.721235 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-hp6z8" Apr 20 19:08:12.728747 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.728728 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hp6z8" Apr 20 19:08:12.737112 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.737092 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:08:12.791937 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.791748 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.801929 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.801911 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 19:08:12.802842 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.802817 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" Apr 20 19:08:12.804900 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:12.804877 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2c41a124527f124f4d00e3a7bbddbf2.slice/crio-6e27b3f2e3a71c2cc2dac262f8804d864dcd531ac0f7ac272c6828d0d27dc707 WatchSource:0}: Error finding container 6e27b3f2e3a71c2cc2dac262f8804d864dcd531ac0f7ac272c6828d0d27dc707: Status 404 returned error can't find the container with id 6e27b3f2e3a71c2cc2dac262f8804d864dcd531ac0f7ac272c6828d0d27dc707 Apr 20 19:08:12.805337 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:12.805314 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc759d37c693dd42a96a635e33d6e4429.slice/crio-ec7f422bc25c7f53b2b16dcb7fe47c2bb67a4ab0cf8cfdd6e358e95a3b1ec25c WatchSource:0}: Error finding container ec7f422bc25c7f53b2b16dcb7fe47c2bb67a4ab0cf8cfdd6e358e95a3b1ec25c: Status 404 returned error can't find the container with id ec7f422bc25c7f53b2b16dcb7fe47c2bb67a4ab0cf8cfdd6e358e95a3b1ec25c Apr 20 19:08:12.808947 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.808934 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:08:12.812337 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.812321 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 19:08:12.827738 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.827695 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" event={"ID":"e2c41a124527f124f4d00e3a7bbddbf2","Type":"ContainerStarted","Data":"6e27b3f2e3a71c2cc2dac262f8804d864dcd531ac0f7ac272c6828d0d27dc707"} Apr 20 19:08:12.828607 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:12.828590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" event={"ID":"c759d37c693dd42a96a635e33d6e4429","Type":"ContainerStarted","Data":"ec7f422bc25c7f53b2b16dcb7fe47c2bb67a4ab0cf8cfdd6e358e95a3b1ec25c"} Apr 20 19:08:13.457526 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.457495 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:08:13.674407 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.674378 2572 apiserver.go:52] "Watching apiserver" Apr 20 19:08:13.680778 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.680747 2572 reflector.go:430] "Caches 
populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 19:08:13.681179 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.681157 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-hqs9s","openshift-dns/node-resolver-lghm5","openshift-image-registry/node-ca-dzjv7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal","openshift-multus/multus-additional-cni-plugins-tng2d","openshift-multus/multus-hktrf","openshift-multus/network-metrics-daemon-rwnnv","openshift-network-diagnostics/network-check-target-nvhzk","openshift-network-operator/iptables-alerter-dz9mm","kube-system/konnectivity-agent-pt5md","kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv","openshift-cluster-node-tuning-operator/tuned-h94wx"] Apr 20 19:08:13.682649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.682622 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:13.683610 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.683587 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.684759 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.684738 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 19:08:13.684886 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.684795 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 19:08:13.685508 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.685052 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j2hm6\"" Apr 20 19:08:13.686371 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.685924 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 19:08:13.686371 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.686288 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-45mgx\"" Apr 20 19:08:13.686371 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.686299 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 19:08:13.687517 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.687349 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.691177 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.689845 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4thbx\"" Apr 20 19:08:13.691177 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.690129 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 19:08:13.691177 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.690278 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 19:08:13.691177 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.690426 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 19:08:13.692130 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.691429 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.692130 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.691437 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.693343 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.692954 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:13.693343 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:13.693032 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:13.693343 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.693300 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 19:08:13.693854 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.693662 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 19:08:13.693854 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.693686 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 19:08:13.693854 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.693701 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jzjqs\"" Apr 20 19:08:13.693854 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.693742 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 19:08:13.694649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.694124 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 19:08:13.694649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.694181 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wgvsw\"" Apr 20 19:08:13.694649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.694333 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:13.694649 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:13.694438 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:13.694649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.694600 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 19:08:13.695460 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.695443 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.697272 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.697216 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:08:13.697591 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.697382 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 19:08:13.697883 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.697703 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rwvq8\"" Apr 20 19:08:13.698524 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.698397 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 19:08:13.698649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.698627 2572 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.699768 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.699745 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.700176 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.699974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.700497 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.700342 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 19:08:13.700629 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.700613 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 19:08:13.700697 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.700682 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6lv2k\"" Apr 20 19:08:13.700755 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.700713 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 19:08:13.700849 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.700816 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 19:08:13.700986 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.700969 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 19:08:13.701974 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.701948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-daemon-config\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.702081 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.701988 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:13.702081 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702017 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w7q6\" (UniqueName: \"kubernetes.io/projected/e9c238c6-ab0d-4140-b842-f59e7642479c-kube-api-access-7w7q6\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:13.702081 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702041 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpfw8\" (UniqueName: \"kubernetes.io/projected/89b86827-3229-4d28-8418-3ba07654afdd-kube-api-access-bpfw8\") pod \"node-ca-dzjv7\" (UID: \"89b86827-3229-4d28-8418-3ba07654afdd\") " pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.702081 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702057 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-cnibin\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702079 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-run-netns\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-etc-kubernetes\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702142 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702149 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f9b82167-c5ea-4998-8d5c-93cec402a0fa-konnectivity-ca\") pod \"konnectivity-agent-pt5md\" (UID: \"f9b82167-c5ea-4998-8d5c-93cec402a0fa\") " pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702156 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702167 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lnc2n\" (UniqueName: \"kubernetes.io/projected/a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08-kube-api-access-lnc2n\") pod \"node-resolver-lghm5\" (UID: \"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08\") " pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702182 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702197 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89b86827-3229-4d28-8418-3ba07654afdd-host\") pod \"node-ca-dzjv7\" (UID: \"89b86827-3229-4d28-8418-3ba07654afdd\") " pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmtt\" (UniqueName: \"kubernetes.io/projected/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-kube-api-access-pvmtt\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-system-cni-dir\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702266 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-run-k8s-cni-cncf-io\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.702309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702288 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f9b82167-c5ea-4998-8d5c-93cec402a0fa-agent-certs\") pod \"konnectivity-agent-pt5md\" (UID: \"f9b82167-c5ea-4998-8d5c-93cec402a0fa\") " pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/89b86827-3229-4d28-8418-3ba07654afdd-serviceca\") pod \"node-ca-dzjv7\" (UID: \"89b86827-3229-4d28-8418-3ba07654afdd\") " pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702358 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-cnibin\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-cni-binary-copy\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702414 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702434 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-crld4\"" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-run-multus-certs\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702506 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-44k7p\"" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702505 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-socket-dir-parent\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702553 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702528 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702586 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smwmx\" (UniqueName: \"kubernetes.io/projected/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-kube-api-access-smwmx\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702611 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-var-lib-cni-multus\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-conf-dir\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702685 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08-tmp-dir\") pod \"node-resolver-lghm5\" (UID: \"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08\") " pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-hostroot\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-os-release\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.703650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702783 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-system-cni-dir\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702798 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-cni-dir\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-os-release\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-var-lib-cni-bin\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.703650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702870 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08-hosts-file\") pod \"node-resolver-lghm5\" (UID: \"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08\") " pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.703650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 
19:08:13.703650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.702921 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-var-lib-kubelet\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.729360 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.729333 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:03:12 +0000 UTC" deadline="2027-10-01 20:00:56.653149173 +0000 UTC" Apr 20 19:08:13.729360 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.729358 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12696h52m42.923793027s" Apr 20 19:08:13.792513 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.792461 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 19:08:13.803227 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803203 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-registration-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.803346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803231 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-sys\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.803346 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:13.803251 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-slash\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.803346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhx87\" (UniqueName: \"kubernetes.io/projected/73d65bde-5605-46fa-9c02-62fdc3f51501-kube-api-access-jhx87\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.803346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-system-cni-dir\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.803346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803313 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-os-release\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.803346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-run-netns\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 
19:08:13.803663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.803663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803377 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-var-lib-kubelet\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.803663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803397 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-system-cni-dir\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.803663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803396 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-os-release\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.803663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-daemon-config\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.803663 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:13.803443 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-var-lib-kubelet\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.803663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8snmq\" (UniqueName: \"kubernetes.io/projected/a96f7ddd-9780-4b41-abdb-dc8d64c0cb84-kube-api-access-8snmq\") pod \"iptables-alerter-dz9mm\" (UID: \"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84\") " pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.803663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803521 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-node-log\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.803663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w7q6\" (UniqueName: \"kubernetes.io/projected/e9c238c6-ab0d-4140-b842-f59e7642479c-kube-api-access-7w7q6\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " 
pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:13.803711 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803727 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-cnibin\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-etc-kubernetes\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803789 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-etc-kubernetes\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:13.803801 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs podName:e9c238c6-ab0d-4140-b842-f59e7642479c nodeName:}" failed. No retries permitted until 2026-04-20 19:08:14.303758508 +0000 UTC m=+3.023061258 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs") pod "network-metrics-daemon-rwnnv" (UID: "e9c238c6-ab0d-4140-b842-f59e7642479c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-etc-openvswitch\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803839 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-cnibin\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-run-openvswitch\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803905 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-sysctl-d\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803935 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f9b82167-c5ea-4998-8d5c-93cec402a0fa-konnectivity-ca\") pod \"konnectivity-agent-pt5md\" (UID: \"f9b82167-c5ea-4998-8d5c-93cec402a0fa\") " pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89b86827-3229-4d28-8418-3ba07654afdd-host\") pod \"node-ca-dzjv7\" (UID: \"89b86827-3229-4d28-8418-3ba07654afdd\") " pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803971 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-daemon-config\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.803983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-host\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.804068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804032 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89b86827-3229-4d28-8418-3ba07654afdd-host\") pod \"node-ca-dzjv7\" (UID: \"89b86827-3229-4d28-8418-3ba07654afdd\") " pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-system-cni-dir\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804195 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-systemd-units\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-system-cni-dir\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-log-socket\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804265 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-sysctl-conf\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: 
I0420 19:08:13.804289 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/377f4c68-c512-417b-92f2-ebb124f472ba-tmp\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f9b82167-c5ea-4998-8d5c-93cec402a0fa-agent-certs\") pod \"konnectivity-agent-pt5md\" (UID: \"f9b82167-c5ea-4998-8d5c-93cec402a0fa\") " pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/89b86827-3229-4d28-8418-3ba07654afdd-serviceca\") pod \"node-ca-dzjv7\" (UID: \"89b86827-3229-4d28-8418-3ba07654afdd\") " pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804371 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a96f7ddd-9780-4b41-abdb-dc8d64c0cb84-iptables-alerter-script\") pod \"iptables-alerter-dz9mm\" (UID: \"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84\") " pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73d65bde-5605-46fa-9c02-62fdc3f51501-ovnkube-config\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 
19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/377f4c68-c512-417b-92f2-ebb124f472ba-etc-tuned\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804493 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-var-lib-cni-multus\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804615 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-socket-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f9b82167-c5ea-4998-8d5c-93cec402a0fa-konnectivity-ca\") pod \"konnectivity-agent-pt5md\" (UID: 
\"f9b82167-c5ea-4998-8d5c-93cec402a0fa\") " pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:13.804673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804644 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08-tmp-dir\") pod \"node-resolver-lghm5\" (UID: \"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08\") " pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-var-lib-cni-multus\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804683 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-hostroot\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804758 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73d65bde-5605-46fa-9c02-62fdc3f51501-env-overrides\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtl9r\" (UniqueName: \"kubernetes.io/projected/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-kube-api-access-xtl9r\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804794 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-os-release\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804850 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-cni-dir\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804858 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-hostroot\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-var-lib-cni-bin\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804949 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-var-lib-cni-bin\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-os-release\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-cni-dir\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.804995 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-kubelet\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805049 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08-tmp-dir\") pod \"node-resolver-lghm5\" (UID: \"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08\") " pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-cni-binary-copy\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.805393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/89b86827-3229-4d28-8418-3ba07654afdd-serviceca\") pod \"node-ca-dzjv7\" (UID: \"89b86827-3229-4d28-8418-3ba07654afdd\") " pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805163 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-run-systemd\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805249 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-run-ovn\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73d65bde-5605-46fa-9c02-62fdc3f51501-ovnkube-script-lib\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805342 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-device-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08-hosts-file\") pod \"node-resolver-lghm5\" (UID: \"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08\") " pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805453 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08-hosts-file\") pod \"node-resolver-lghm5\" (UID: \"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08\") " pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805539 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-etc-selinux\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 
20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpfw8\" (UniqueName: \"kubernetes.io/projected/89b86827-3229-4d28-8418-3ba07654afdd-kube-api-access-bpfw8\") pod \"node-ca-dzjv7\" (UID: \"89b86827-3229-4d28-8418-3ba07654afdd\") " pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-run-netns\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805652 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-run\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnc2n\" (UniqueName: \"kubernetes.io/projected/a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08-kube-api-access-lnc2n\") pod \"node-resolver-lghm5\" (UID: \"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08\") " pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-run-netns\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.806271 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:13.805740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmtt\" (UniqueName: \"kubernetes.io/projected/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-kube-api-access-pvmtt\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-var-lib-openvswitch\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.806271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-sys-fs\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805858 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-modprobe-d\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-run-k8s-cni-cncf-io\") pod \"multus-hktrf\" (UID: 
\"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-cni-bin\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.805993 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73d65bde-5605-46fa-9c02-62fdc3f51501-ovn-node-metrics-cert\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-run-k8s-cni-cncf-io\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806069 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-sysconfig\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-lib-modules\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ph44\" (UniqueName: \"kubernetes.io/projected/377f4c68-c512-417b-92f2-ebb124f472ba-kube-api-access-4ph44\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-cnibin\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-cni-binary-copy\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806225 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-run-multus-certs\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-cnibin\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806265 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-host-run-multus-certs\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806316 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a96f7ddd-9780-4b41-abdb-dc8d64c0cb84-host-slash\") pod \"iptables-alerter-dz9mm\" (UID: \"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84\") " pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806363 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-cni-netd\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806379 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-kubernetes\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.807055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806394 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-systemd\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806433 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-var-lib-kubelet\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806462 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-socket-dir-parent\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806536 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smwmx\" (UniqueName: \"kubernetes.io/projected/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-kube-api-access-smwmx\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 
20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-socket-dir-parent\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806552 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-conf-dir\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806589 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-run-ovn-kubernetes\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806621 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-multus-conf-dir\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.806759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-cni-binary-copy\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.807689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.807104 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.808363 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.808344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f9b82167-c5ea-4998-8d5c-93cec402a0fa-agent-certs\") pod \"konnectivity-agent-pt5md\" (UID: \"f9b82167-c5ea-4998-8d5c-93cec402a0fa\") " pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:13.814084 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.814065 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w7q6\" (UniqueName: \"kubernetes.io/projected/e9c238c6-ab0d-4140-b842-f59e7642479c-kube-api-access-7w7q6\") pod \"network-metrics-daemon-rwnnv\" (UID: 
\"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:13.815676 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.815635 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smwmx\" (UniqueName: \"kubernetes.io/projected/09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b-kube-api-access-smwmx\") pod \"multus-additional-cni-plugins-tng2d\" (UID: \"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b\") " pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:13.815814 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.815793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnc2n\" (UniqueName: \"kubernetes.io/projected/a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08-kube-api-access-lnc2n\") pod \"node-resolver-lghm5\" (UID: \"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08\") " pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:13.815893 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.815879 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmtt\" (UniqueName: \"kubernetes.io/projected/5cfa2a80-feb6-4f70-8ca4-75225aa4dae7-kube-api-access-pvmtt\") pod \"multus-hktrf\" (UID: \"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7\") " pod="openshift-multus/multus-hktrf" Apr 20 19:08:13.816316 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.816299 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpfw8\" (UniqueName: \"kubernetes.io/projected/89b86827-3229-4d28-8418-3ba07654afdd-kube-api-access-bpfw8\") pod \"node-ca-dzjv7\" (UID: \"89b86827-3229-4d28-8418-3ba07654afdd\") " pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:13.907768 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907732 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-run-netns\") pod 
\"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.907942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907777 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.907942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907804 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8snmq\" (UniqueName: \"kubernetes.io/projected/a96f7ddd-9780-4b41-abdb-dc8d64c0cb84-kube-api-access-8snmq\") pod \"iptables-alerter-dz9mm\" (UID: \"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84\") " pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.907942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-node-log\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.907942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907835 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-run-netns\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.907942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-etc-openvswitch\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.907942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-run-openvswitch\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.907942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-sysctl-d\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.907942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-kubelet-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.907942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-host\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907948 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-run-openvswitch\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-systemd-units\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-node-log\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-log-socket\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.907998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-etc-openvswitch\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-host\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-sysctl-conf\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908033 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/377f4c68-c512-417b-92f2-ebb124f472ba-tmp\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-log-socket\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908046 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-systemd-units\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-sysctl-d\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a96f7ddd-9780-4b41-abdb-dc8d64c0cb84-iptables-alerter-script\") pod \"iptables-alerter-dz9mm\" (UID: \"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84\") " pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73d65bde-5605-46fa-9c02-62fdc3f51501-ovnkube-config\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/377f4c68-c512-417b-92f2-ebb124f472ba-etc-tuned\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908132 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-sysctl-conf\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908147 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-socket-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.908379 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73d65bde-5605-46fa-9c02-62fdc3f51501-env-overrides\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtl9r\" (UniqueName: \"kubernetes.io/projected/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-kube-api-access-xtl9r\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-kubelet\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908277 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-run-systemd\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-socket-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-run-ovn\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73d65bde-5605-46fa-9c02-62fdc3f51501-ovnkube-script-lib\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908378 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-device-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.909256 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:13.908403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-etc-selinux\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-run\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-var-lib-openvswitch\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-sys-fs\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-modprobe-d\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 
19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-cni-bin\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908576 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73d65bde-5605-46fa-9c02-62fdc3f51501-ovn-node-metrics-cert\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908600 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-sysconfig\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.909256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-lib-modules\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a96f7ddd-9780-4b41-abdb-dc8d64c0cb84-iptables-alerter-script\") pod \"iptables-alerter-dz9mm\" (UID: \"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84\") " 
pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ph44\" (UniqueName: \"kubernetes.io/projected/377f4c68-c512-417b-92f2-ebb124f472ba-kube-api-access-4ph44\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a96f7ddd-9780-4b41-abdb-dc8d64c0cb84-host-slash\") pod \"iptables-alerter-dz9mm\" (UID: \"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84\") " pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908702 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-cni-netd\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/73d65bde-5605-46fa-9c02-62fdc3f51501-ovnkube-config\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-kubernetes\") pod \"tuned-h94wx\" (UID: 
\"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908774 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-systemd\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-run-systemd\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-var-lib-kubelet\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-run-ovn-kubernetes\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908885 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-device-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908899 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-kubelet\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908906 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-registration-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908936 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-etc-selinux\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908935 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/73d65bde-5605-46fa-9c02-62fdc3f51501-ovnkube-script-lib\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-sys\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/73d65bde-5605-46fa-9c02-62fdc3f51501-env-overrides\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-registration-dir\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-sys\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908986 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909008 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-run-ovn\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908992 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-sysconfig\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909024 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-run-ovn-kubernetes\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-var-lib-openvswitch\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908849 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-var-lib-kubelet\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a96f7ddd-9780-4b41-abdb-dc8d64c0cb84-host-slash\") pod \"iptables-alerter-dz9mm\" (UID: \"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84\") " pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.908992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-slash\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909074 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-kubernetes\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909078 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhx87\" (UniqueName: \"kubernetes.io/projected/73d65bde-5605-46fa-9c02-62fdc3f51501-kube-api-access-jhx87\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909097 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-sys-fs\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-run\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-cni-bin\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909132 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-modprobe-d\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.910878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909137 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-etc-systemd\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.911736 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909210 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/377f4c68-c512-417b-92f2-ebb124f472ba-lib-modules\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.911736 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-cni-netd\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.911736 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.909286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/73d65bde-5605-46fa-9c02-62fdc3f51501-host-slash\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.911736 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.910894 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/377f4c68-c512-417b-92f2-ebb124f472ba-tmp\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.911736 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.911266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/73d65bde-5605-46fa-9c02-62fdc3f51501-ovn-node-metrics-cert\") pod \"ovnkube-node-hqs9s\" (UID: \"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:13.911736 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.911382 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/377f4c68-c512-417b-92f2-ebb124f472ba-etc-tuned\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.917973 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:13.915576 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:13.917973 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:13.915606 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:13.917973 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:13.915622 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h5bcf for pod openshift-network-diagnostics/network-check-target-nvhzk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:13.917973 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:13.915697 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf podName:d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e nodeName:}" failed. No retries permitted until 2026-04-20 19:08:14.415677451 +0000 UTC m=+3.134980221 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h5bcf" (UniqueName: "kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf") pod "network-check-target-nvhzk" (UID: "d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:13.918270 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.917957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8snmq\" (UniqueName: \"kubernetes.io/projected/a96f7ddd-9780-4b41-abdb-dc8d64c0cb84-kube-api-access-8snmq\") pod \"iptables-alerter-dz9mm\" (UID: \"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84\") " pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:13.918947 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.918926 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtl9r\" (UniqueName: \"kubernetes.io/projected/2e6e36e3-ed64-4a29-a3d4-5181a81b8f72-kube-api-access-xtl9r\") pod \"aws-ebs-csi-driver-node-vp2sv\" (UID: \"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:13.919819 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.919799 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ph44\" (UniqueName: \"kubernetes.io/projected/377f4c68-c512-417b-92f2-ebb124f472ba-kube-api-access-4ph44\") pod \"tuned-h94wx\" (UID: \"377f4c68-c512-417b-92f2-ebb124f472ba\") " pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:13.919819 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:13.919805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhx87\" (UniqueName: \"kubernetes.io/projected/73d65bde-5605-46fa-9c02-62fdc3f51501-kube-api-access-jhx87\") pod \"ovnkube-node-hqs9s\" (UID: 
\"73d65bde-5605-46fa-9c02-62fdc3f51501\") " pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:14.000491 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.000398 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:14.014346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.014312 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lghm5" Apr 20 19:08:14.021602 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.021583 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dzjv7" Apr 20 19:08:14.028116 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.028100 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tng2d" Apr 20 19:08:14.035728 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.035709 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hktrf" Apr 20 19:08:14.043311 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.043294 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-dz9mm" Apr 20 19:08:14.050907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.050889 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:14.058483 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.058448 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h94wx" Apr 20 19:08:14.063095 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.063075 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" Apr 20 19:08:14.189690 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.189655 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:08:14.313152 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.313064 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:14.313305 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:14.313220 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:14.313305 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:14.313297 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs podName:e9c238c6-ab0d-4140-b842-f59e7642479c nodeName:}" failed. No retries permitted until 2026-04-20 19:08:15.313278375 +0000 UTC m=+4.032581126 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs") pod "network-metrics-daemon-rwnnv" (UID: "e9c238c6-ab0d-4140-b842-f59e7642479c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:14.511650 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:14.511620 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b86827_3229_4d28_8418_3ba07654afdd.slice/crio-774b6309f41276060ec495b66f5177d9eebc5851a4242ff868a89da13dec2f0d WatchSource:0}: Error finding container 774b6309f41276060ec495b66f5177d9eebc5851a4242ff868a89da13dec2f0d: Status 404 returned error can't find the container with id 774b6309f41276060ec495b66f5177d9eebc5851a4242ff868a89da13dec2f0d Apr 20 19:08:14.512437 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:14.512410 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73d65bde_5605_46fa_9c02_62fdc3f51501.slice/crio-dc923fe32d92629191d81fa1aaf05da13e3e5965b58278000bbe27dc41220795 WatchSource:0}: Error finding container dc923fe32d92629191d81fa1aaf05da13e3e5965b58278000bbe27dc41220795: Status 404 returned error can't find the container with id dc923fe32d92629191d81fa1aaf05da13e3e5965b58278000bbe27dc41220795 Apr 20 19:08:14.513862 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:14.513838 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9b82167_c5ea_4998_8d5c_93cec402a0fa.slice/crio-00b199f4189fef97abfd9d8be4b5f8c71b11dea5509a412c7bac355abc8fd804 WatchSource:0}: Error finding container 00b199f4189fef97abfd9d8be4b5f8c71b11dea5509a412c7bac355abc8fd804: Status 404 returned error can't find the container with id 00b199f4189fef97abfd9d8be4b5f8c71b11dea5509a412c7bac355abc8fd804 Apr 20 19:08:14.514087 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:14.513962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:14.514145 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:14.514124 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:14.514145 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:14.514141 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:14.514242 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:14.514154 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h5bcf for pod openshift-network-diagnostics/network-check-target-nvhzk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:14.514291 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:14.514261 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf podName:d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e nodeName:}" failed. No retries permitted until 2026-04-20 19:08:15.51424164 +0000 UTC m=+4.233544402 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h5bcf" (UniqueName: "kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf") pod "network-check-target-nvhzk" (UID: "d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:14.514716 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:14.514604 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e6e36e3_ed64_4a29_a3d4_5181a81b8f72.slice/crio-3f948274f87f0d9795bdf9f3565f02c79dfc3fc329320f449e51c2ac4c99d574 WatchSource:0}: Error finding container 3f948274f87f0d9795bdf9f3565f02c79dfc3fc329320f449e51c2ac4c99d574: Status 404 returned error can't find the container with id 3f948274f87f0d9795bdf9f3565f02c79dfc3fc329320f449e51c2ac4c99d574 Apr 20 19:08:14.515710 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:14.515506 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96f7ddd_9780_4b41_abdb_dc8d64c0cb84.slice/crio-f08ef4b0320f7479b8d513d33a8c8b92cac6449ff02c3bf89ee1ff70c3ea591f WatchSource:0}: Error finding container f08ef4b0320f7479b8d513d33a8c8b92cac6449ff02c3bf89ee1ff70c3ea591f: Status 404 returned error can't find the container with id f08ef4b0320f7479b8d513d33a8c8b92cac6449ff02c3bf89ee1ff70c3ea591f Apr 20 19:08:14.517799 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:14.517777 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8d74104_dc6c_4c07_a9eb_c5e76c1d3b08.slice/crio-b84725b2db3cd6ef490cfaffaef3cc3ebb9a75f8530585b6292bda2fa6306b3b WatchSource:0}: Error finding container b84725b2db3cd6ef490cfaffaef3cc3ebb9a75f8530585b6292bda2fa6306b3b: Status 404 returned error can't find the container 
with id b84725b2db3cd6ef490cfaffaef3cc3ebb9a75f8530585b6292bda2fa6306b3b Apr 20 19:08:14.518736 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:14.518689 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod377f4c68_c512_417b_92f2_ebb124f472ba.slice/crio-cb55434bec6fe869856f664072f4b653bb31f43a5148199f0d4fbe8a0307a5de WatchSource:0}: Error finding container cb55434bec6fe869856f664072f4b653bb31f43a5148199f0d4fbe8a0307a5de: Status 404 returned error can't find the container with id cb55434bec6fe869856f664072f4b653bb31f43a5148199f0d4fbe8a0307a5de Apr 20 19:08:14.519783 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:14.519760 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09af58d0_bcc7_4f62_a6cc_c8e13d0edc7b.slice/crio-6d79de8809f5fd41141985ccd9ebecb86b88276d2787414f0aa2c771eb949c7c WatchSource:0}: Error finding container 6d79de8809f5fd41141985ccd9ebecb86b88276d2787414f0aa2c771eb949c7c: Status 404 returned error can't find the container with id 6d79de8809f5fd41141985ccd9ebecb86b88276d2787414f0aa2c771eb949c7c Apr 20 19:08:14.521119 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:14.521086 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cfa2a80_feb6_4f70_8ca4_75225aa4dae7.slice/crio-743eda1c956a156dce6a8fec7bfcc40135670cf7c4b7009e62b68da3b38daf64 WatchSource:0}: Error finding container 743eda1c956a156dce6a8fec7bfcc40135670cf7c4b7009e62b68da3b38daf64: Status 404 returned error can't find the container with id 743eda1c956a156dce6a8fec7bfcc40135670cf7c4b7009e62b68da3b38daf64 Apr 20 19:08:14.558556 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.558531 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5nsbx"] Apr 20 19:08:14.560640 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:08:14.560618 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:14.560755 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:14.560699 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:14.615250 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.615218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-kubelet-config\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:14.615381 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.615262 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-dbus\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:14.615381 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.615361 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:14.716711 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.716674 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:14.717156 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.716758 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-kubelet-config\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:14.717156 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.716788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-dbus\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:14.717156 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:14.716840 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:14.717156 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.716880 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-dbus\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:14.717156 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:14.716911 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret podName:7850d8bf-83d1-45ed-9a2d-cccbc11a2db8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:15.216892591 +0000 UTC m=+3.936195346 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret") pod "global-pull-secret-syncer-5nsbx" (UID: "7850d8bf-83d1-45ed-9a2d-cccbc11a2db8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:14.717156 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.716889 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-kubelet-config\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:14.730252 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.730223 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:03:12 +0000 UTC" deadline="2027-09-21 11:01:24.453746875 +0000 UTC"
Apr 20 19:08:14.730252 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.730248 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12447h53m9.723501008s"
Apr 20 19:08:14.832136 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.832050 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hktrf" event={"ID":"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7","Type":"ContainerStarted","Data":"743eda1c956a156dce6a8fec7bfcc40135670cf7c4b7009e62b68da3b38daf64"}
Apr 20 19:08:14.833068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.833045 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerStarted","Data":"6d79de8809f5fd41141985ccd9ebecb86b88276d2787414f0aa2c771eb949c7c"}
Apr 20 19:08:14.834025 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.833998 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h94wx" event={"ID":"377f4c68-c512-417b-92f2-ebb124f472ba","Type":"ContainerStarted","Data":"cb55434bec6fe869856f664072f4b653bb31f43a5148199f0d4fbe8a0307a5de"}
Apr 20 19:08:14.834993 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.834968 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lghm5" event={"ID":"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08","Type":"ContainerStarted","Data":"b84725b2db3cd6ef490cfaffaef3cc3ebb9a75f8530585b6292bda2fa6306b3b"}
Apr 20 19:08:14.835881 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.835860 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dz9mm" event={"ID":"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84","Type":"ContainerStarted","Data":"f08ef4b0320f7479b8d513d33a8c8b92cac6449ff02c3bf89ee1ff70c3ea591f"}
Apr 20 19:08:14.836788 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.836770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" event={"ID":"73d65bde-5605-46fa-9c02-62fdc3f51501","Type":"ContainerStarted","Data":"dc923fe32d92629191d81fa1aaf05da13e3e5965b58278000bbe27dc41220795"}
Apr 20 19:08:14.837815 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.837795 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dzjv7" event={"ID":"89b86827-3229-4d28-8418-3ba07654afdd","Type":"ContainerStarted","Data":"774b6309f41276060ec495b66f5177d9eebc5851a4242ff868a89da13dec2f0d"}
Apr 20 19:08:14.839257 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.839236 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" event={"ID":"c759d37c693dd42a96a635e33d6e4429","Type":"ContainerStarted","Data":"026f2dffdf3d8f5fda35556e9b8cd23576b5088f2b1ccc49b5ed6f7275f7129d"}
Apr 20 19:08:14.840222 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.840201 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" event={"ID":"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72","Type":"ContainerStarted","Data":"3f948274f87f0d9795bdf9f3565f02c79dfc3fc329320f449e51c2ac4c99d574"}
Apr 20 19:08:14.841157 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.841140 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pt5md" event={"ID":"f9b82167-c5ea-4998-8d5c-93cec402a0fa","Type":"ContainerStarted","Data":"00b199f4189fef97abfd9d8be4b5f8c71b11dea5509a412c7bac355abc8fd804"}
Apr 20 19:08:14.853269 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:14.853225 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-5.ec2.internal" podStartSLOduration=2.853215336 podStartE2EDuration="2.853215336s" podCreationTimestamp="2026-04-20 19:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:08:14.852483137 +0000 UTC m=+3.571785900" watchObservedRunningTime="2026-04-20 19:08:14.853215336 +0000 UTC m=+3.572518107"
Apr 20 19:08:15.220333 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:15.220298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:15.220524 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.220444 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:15.220584 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.220530 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret podName:7850d8bf-83d1-45ed-9a2d-cccbc11a2db8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:16.220510661 +0000 UTC m=+4.939813417 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret") pod "global-pull-secret-syncer-5nsbx" (UID: "7850d8bf-83d1-45ed-9a2d-cccbc11a2db8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:15.320702 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:15.320663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:15.320867 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.320855 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:15.320933 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.320917 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs podName:e9c238c6-ab0d-4140-b842-f59e7642479c nodeName:}" failed. No retries permitted until 2026-04-20 19:08:17.320897653 +0000 UTC m=+6.040200407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs") pod "network-metrics-daemon-rwnnv" (UID: "e9c238c6-ab0d-4140-b842-f59e7642479c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:15.523460 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:15.523375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:15.523641 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.523584 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:08:15.523641 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.523605 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:08:15.523641 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.523616 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h5bcf for pod openshift-network-diagnostics/network-check-target-nvhzk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:15.523793 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.523674 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf podName:d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e nodeName:}" failed. No retries permitted until 2026-04-20 19:08:17.523655098 +0000 UTC m=+6.242957866 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-h5bcf" (UniqueName: "kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf") pod "network-check-target-nvhzk" (UID: "d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:15.826450 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:15.825297 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:15.826450 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.825430 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8"
Apr 20 19:08:15.826450 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:15.825904 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:15.826450 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.826069 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c"
Apr 20 19:08:15.826450 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:15.826342 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:15.826450 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:15.826420 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e"
Apr 20 19:08:15.864168 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:15.863391 2572 generic.go:358] "Generic (PLEG): container finished" podID="e2c41a124527f124f4d00e3a7bbddbf2" containerID="91f1d8e2cbab96d776ed14f205f732c7d41dd0a9311b359e11ab4226ea13e969" exitCode=0
Apr 20 19:08:15.864168 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:15.863572 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" event={"ID":"e2c41a124527f124f4d00e3a7bbddbf2","Type":"ContainerDied","Data":"91f1d8e2cbab96d776ed14f205f732c7d41dd0a9311b359e11ab4226ea13e969"}
Apr 20 19:08:16.229340 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:16.229295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:16.229540 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:16.229458 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:16.229616 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:16.229564 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret podName:7850d8bf-83d1-45ed-9a2d-cccbc11a2db8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:18.229543311 +0000 UTC m=+6.948846076 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret") pod "global-pull-secret-syncer-5nsbx" (UID: "7850d8bf-83d1-45ed-9a2d-cccbc11a2db8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:16.869416 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:16.869363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" event={"ID":"e2c41a124527f124f4d00e3a7bbddbf2","Type":"ContainerStarted","Data":"8281efd709e9b4549afebbec97b6da8c9152c2b4ce5a82c6f3c767f4bf569819"}
Apr 20 19:08:17.338762 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:17.338726 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:17.338944 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:17.338931 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:17.339368 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:17.339014 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs podName:e9c238c6-ab0d-4140-b842-f59e7642479c nodeName:}" failed. No retries permitted until 2026-04-20 19:08:21.338990414 +0000 UTC m=+10.058293170 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs") pod "network-metrics-daemon-rwnnv" (UID: "e9c238c6-ab0d-4140-b842-f59e7642479c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:17.540536 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:17.540495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:17.540722 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:17.540686 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:08:17.540722 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:17.540714 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:08:17.540843 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:17.540730 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h5bcf for pod openshift-network-diagnostics/network-check-target-nvhzk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:17.540843 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:17.540798 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf podName:d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e nodeName:}" failed. No retries permitted until 2026-04-20 19:08:21.540777417 +0000 UTC m=+10.260080167 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-h5bcf" (UniqueName: "kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf") pod "network-check-target-nvhzk" (UID: "d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:17.824949 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:17.824829 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:17.825125 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:17.824970 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8"
Apr 20 19:08:17.825125 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:17.825053 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:17.825259 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:17.825167 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c"
Apr 20 19:08:17.825259 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:17.825236 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:17.825406 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:17.825341 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e"
Apr 20 19:08:18.248953 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:18.248914 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:18.249417 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:18.249077 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:18.249417 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:18.249143 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret podName:7850d8bf-83d1-45ed-9a2d-cccbc11a2db8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:22.249124209 +0000 UTC m=+10.968426965 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret") pod "global-pull-secret-syncer-5nsbx" (UID: "7850d8bf-83d1-45ed-9a2d-cccbc11a2db8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:19.825084 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:19.824995 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:19.825547 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:19.825124 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8"
Apr 20 19:08:19.825547 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:19.825202 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:19.825547 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:19.825284 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e"
Apr 20 19:08:19.825547 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:19.825325 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:19.825547 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:19.825403 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c"
Apr 20 19:08:21.378254 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:21.378100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:21.378254 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:21.378249 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:21.378822 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:21.378315 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs podName:e9c238c6-ab0d-4140-b842-f59e7642479c nodeName:}" failed. No retries permitted until 2026-04-20 19:08:29.378296668 +0000 UTC m=+18.097599417 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs") pod "network-metrics-daemon-rwnnv" (UID: "e9c238c6-ab0d-4140-b842-f59e7642479c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:21.579713 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:21.579676 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:21.579887 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:21.579792 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:08:21.579887 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:21.579816 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:08:21.579887 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:21.579829 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h5bcf for pod openshift-network-diagnostics/network-check-target-nvhzk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:21.580033 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:21.579893 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf podName:d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e nodeName:}" failed. No retries permitted until 2026-04-20 19:08:29.579873013 +0000 UTC m=+18.299175779 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-h5bcf" (UniqueName: "kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf") pod "network-check-target-nvhzk" (UID: "d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:21.825957 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:21.825511 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:21.825957 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:21.825622 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e"
Apr 20 19:08:21.825957 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:21.825860 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:21.826231 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:21.825978 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:21.826231 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:21.826065 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c"
Apr 20 19:08:21.826231 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:21.826105 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8"
Apr 20 19:08:22.285866 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:22.285776 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:22.286043 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:22.285947 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:22.286043 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:22.286027 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret podName:7850d8bf-83d1-45ed-9a2d-cccbc11a2db8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:30.286004753 +0000 UTC m=+19.005307527 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret") pod "global-pull-secret-syncer-5nsbx" (UID: "7850d8bf-83d1-45ed-9a2d-cccbc11a2db8") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:23.825254 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:23.825170 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:23.825254 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:23.825202 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:23.825254 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:23.825237 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:23.825803 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:23.825306 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e"
Apr 20 19:08:23.825803 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:23.825742 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8"
Apr 20 19:08:23.825911 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:23.825836 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c"
Apr 20 19:08:25.825096 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:25.825063 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:25.825565 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:25.825063 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:25.825565 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:25.825185 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8"
Apr 20 19:08:25.825565 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:25.825068 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:25.825565 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:25.825265 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e"
Apr 20 19:08:25.825565 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:25.825390 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c"
Apr 20 19:08:27.825073 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:27.824977 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk"
Apr 20 19:08:27.825511 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:27.825089 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:27.825511 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:27.825100 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e"
Apr 20 19:08:27.825511 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:27.825147 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx"
Apr 20 19:08:27.825511 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:27.825291 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8"
Apr 20 19:08:27.825511 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:27.825406 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c"
Apr 20 19:08:29.435256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:29.435213 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:08:29.435715 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:29.435382 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:29.435715 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:29.435484 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs podName:e9c238c6-ab0d-4140-b842-f59e7642479c nodeName:}" failed. No retries permitted until 2026-04-20 19:08:45.435447589 +0000 UTC m=+34.154750351 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs") pod "network-metrics-daemon-rwnnv" (UID: "e9c238c6-ab0d-4140-b842-f59e7642479c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:29.636416 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:29.636379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:29.636578 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:29.636514 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:29.636578 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:29.636528 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:29.636578 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:29.636538 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h5bcf for pod openshift-network-diagnostics/network-check-target-nvhzk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:29.636669 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:29.636586 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf podName:d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e nodeName:}" failed. 
No retries permitted until 2026-04-20 19:08:45.636573175 +0000 UTC m=+34.355875924 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-h5bcf" (UniqueName: "kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf") pod "network-check-target-nvhzk" (UID: "d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:29.824657 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:29.824578 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:29.824657 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:29.824627 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:29.824863 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:29.824719 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:29.824863 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:29.824791 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:29.824863 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:29.824830 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:29.825014 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:29.824927 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:30.342938 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:30.342897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:30.343134 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:30.343062 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:08:30.343194 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:30.343141 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret podName:7850d8bf-83d1-45ed-9a2d-cccbc11a2db8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:46.343120993 +0000 UTC m=+35.062423749 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret") pod "global-pull-secret-syncer-5nsbx" (UID: "7850d8bf-83d1-45ed-9a2d-cccbc11a2db8") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:08:31.825709 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.825686 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:31.826311 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:31.825775 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:31.826311 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.826030 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:31.826311 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:31.826099 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:31.826311 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.826129 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:31.826311 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:31.826165 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:31.895412 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.895380 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hktrf" event={"ID":"5cfa2a80-feb6-4f70-8ca4-75225aa4dae7","Type":"ContainerStarted","Data":"ef735acacd6c08e1e16ee2b65f3b8ce578b696d1ed4f120da329e71cafb77ee3"} Apr 20 19:08:31.896876 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.896844 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerStarted","Data":"46a54011de557e9be2ec7f5945c6e20b63bd3090e72a83f02020f1a41c719ed4"} Apr 20 19:08:31.898389 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.898350 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h94wx" event={"ID":"377f4c68-c512-417b-92f2-ebb124f472ba","Type":"ContainerStarted","Data":"d57038c34a1a76ce84cf075657ffe35ded7314074a9a785a4b3ed6e73b895414"} Apr 20 19:08:31.899712 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.899683 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lghm5" event={"ID":"a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08","Type":"ContainerStarted","Data":"0467fca976ff70a5d8eac655cbc8fd4455ddac831432436a5447288124cacdcf"} Apr 20 19:08:31.902287 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.902260 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" event={"ID":"73d65bde-5605-46fa-9c02-62fdc3f51501","Type":"ContainerStarted","Data":"662dfbf4ed76e3c57abbd4b072b8313f08feccca6902cd44cd410d4300279e24"} Apr 20 19:08:31.902388 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.902297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" event={"ID":"73d65bde-5605-46fa-9c02-62fdc3f51501","Type":"ContainerStarted","Data":"871397e5809ba2e8e1f1ca0243047ab54bfdf8d4d6d75654e743eb21745ac62d"} Apr 20 19:08:31.902388 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.902310 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" event={"ID":"73d65bde-5605-46fa-9c02-62fdc3f51501","Type":"ContainerStarted","Data":"a8baf70d98c537c9536c71ed6d0dcdcb8f23f42e9f108295d7625c20f94f14bb"} Apr 20 19:08:31.904067 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.904046 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dzjv7" event={"ID":"89b86827-3229-4d28-8418-3ba07654afdd","Type":"ContainerStarted","Data":"07119be812677d862bcb303c0007c856024a79cfa9982a7eee3a4ac8c3a3001b"} Apr 20 19:08:31.905251 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.905230 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" event={"ID":"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72","Type":"ContainerStarted","Data":"997dc393db57c60d396965a66b9442b71521d208310694696ef516603c210c2c"} Apr 20 19:08:31.906637 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.906615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pt5md" event={"ID":"f9b82167-c5ea-4998-8d5c-93cec402a0fa","Type":"ContainerStarted","Data":"946a2bfe266fb7ba7d26028f4c1394d7ac1a1227a813863b4d4a792f88414baa"} Apr 20 19:08:31.916849 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:31.916800 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-5.ec2.internal" podStartSLOduration=19.916784933 podStartE2EDuration="19.916784933s" podCreationTimestamp="2026-04-20 19:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:08:16.889567066 +0000 UTC m=+5.608869840" watchObservedRunningTime="2026-04-20 19:08:31.916784933 +0000 UTC m=+20.636087707" Apr 20 19:08:31.917059 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.917029 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hktrf" podStartSLOduration=3.951204197 podStartE2EDuration="20.917021941s" podCreationTimestamp="2026-04-20 19:08:11 +0000 UTC" firstStartedPulling="2026-04-20 19:08:14.523503232 +0000 UTC m=+3.242805986" lastFinishedPulling="2026-04-20 19:08:31.489320965 +0000 UTC m=+20.208623730" observedRunningTime="2026-04-20 19:08:31.916924243 +0000 UTC m=+20.636227005" watchObservedRunningTime="2026-04-20 19:08:31.917021941 +0000 UTC m=+20.636324714" Apr 20 19:08:31.931589 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.931551 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lghm5" podStartSLOduration=4.000090827 podStartE2EDuration="20.931537s" podCreationTimestamp="2026-04-20 19:08:11 +0000 UTC" firstStartedPulling="2026-04-20 19:08:14.519822388 +0000 UTC m=+3.239125153" lastFinishedPulling="2026-04-20 19:08:31.451268563 +0000 UTC m=+20.170571326" observedRunningTime="2026-04-20 19:08:31.931260048 +0000 UTC m=+20.650562912" watchObservedRunningTime="2026-04-20 19:08:31.931537 +0000 UTC m=+20.650839771" Apr 20 19:08:31.970619 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.970579 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-h94wx" podStartSLOduration=3.040149891 podStartE2EDuration="19.970565559s" podCreationTimestamp="2026-04-20 19:08:12 +0000 UTC" firstStartedPulling="2026-04-20 19:08:14.520666511 +0000 UTC m=+3.239969265" lastFinishedPulling="2026-04-20 19:08:31.45108217 +0000 UTC m=+20.170384933" observedRunningTime="2026-04-20 19:08:31.970346056 +0000 UTC m=+20.689648828" watchObservedRunningTime="2026-04-20 19:08:31.970565559 +0000 UTC m=+20.689868331" Apr 20 19:08:31.987212 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:31.987169 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pt5md" podStartSLOduration=4.052237752 podStartE2EDuration="20.987155891s" podCreationTimestamp="2026-04-20 19:08:11 +0000 UTC" firstStartedPulling="2026-04-20 19:08:14.51610664 +0000 UTC m=+3.235409390" lastFinishedPulling="2026-04-20 19:08:31.451024771 +0000 UTC m=+20.170327529" observedRunningTime="2026-04-20 19:08:31.986313768 +0000 UTC m=+20.705616540" watchObservedRunningTime="2026-04-20 19:08:31.987155891 +0000 UTC m=+20.706458663" Apr 20 19:08:32.000805 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.000765 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dzjv7" podStartSLOduration=12.031092123 podStartE2EDuration="21.000749969s" podCreationTimestamp="2026-04-20 19:08:11 +0000 UTC" firstStartedPulling="2026-04-20 19:08:14.513171876 +0000 UTC m=+3.232474631" lastFinishedPulling="2026-04-20 19:08:23.482829724 +0000 UTC m=+12.202132477" observedRunningTime="2026-04-20 19:08:32.000318589 +0000 UTC m=+20.719621391" watchObservedRunningTime="2026-04-20 19:08:32.000749969 +0000 UTC m=+20.720052741" Apr 20 19:08:32.633660 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.633638 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" 
path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 19:08:32.773076 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.772989 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T19:08:32.633654897Z","UUID":"79c9e40c-1fdb-4a81-bfc0-a5ed91821940","Handler":null,"Name":"","Endpoint":""} Apr 20 19:08:32.775393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.775371 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 19:08:32.775393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.775398 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 19:08:32.909666 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.909634 2572 generic.go:358] "Generic (PLEG): container finished" podID="09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b" containerID="46a54011de557e9be2ec7f5945c6e20b63bd3090e72a83f02020f1a41c719ed4" exitCode=0 Apr 20 19:08:32.910036 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.909705 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerDied","Data":"46a54011de557e9be2ec7f5945c6e20b63bd3090e72a83f02020f1a41c719ed4"} Apr 20 19:08:32.912500 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.912451 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" event={"ID":"73d65bde-5605-46fa-9c02-62fdc3f51501","Type":"ContainerStarted","Data":"1e223859fdbdc0a21b89b7620fb79fcb7969073647f0066f4e6038df9d80fd22"} Apr 20 19:08:32.912613 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.912510 2572 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" event={"ID":"73d65bde-5605-46fa-9c02-62fdc3f51501","Type":"ContainerStarted","Data":"551b37545f6a3374f15dc66af92b86abe0ddafec1a29bf2470c0cfa987148081"} Apr 20 19:08:32.912613 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.912526 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" event={"ID":"73d65bde-5605-46fa-9c02-62fdc3f51501","Type":"ContainerStarted","Data":"d4af081a75455c2a42d3eef65247465a6612935e15ca57d2d27c81e899fe4a52"} Apr 20 19:08:32.914130 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:32.914105 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" event={"ID":"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72","Type":"ContainerStarted","Data":"e6a5a0b55f672bc21a0a021440582489c2f2f05421217cebb301f5957b8e5f8c"} Apr 20 19:08:33.415496 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.415461 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:33.416096 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.416077 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:33.824899 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.824819 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:33.825040 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.824819 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:33.825040 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:33.824968 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:33.825040 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.824819 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:33.825040 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:33.825008 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:33.825184 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:33.825073 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:33.917942 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.917906 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" event={"ID":"2e6e36e3-ed64-4a29-a3d4-5181a81b8f72","Type":"ContainerStarted","Data":"6fa3a779a79a1859395c33fd122937c6a9482550451140f08400d95c8a6cad21"} Apr 20 19:08:33.919204 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.919175 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dz9mm" event={"ID":"a96f7ddd-9780-4b41-abdb-dc8d64c0cb84","Type":"ContainerStarted","Data":"1b273cea5f30ba4839eedd6c9cfa90e69b865fa7f026899473df19a6496f9352"} Apr 20 19:08:33.919338 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.919327 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:33.919758 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.919741 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pt5md" Apr 20 19:08:33.934540 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:33.934505 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-vp2sv" podStartSLOduration=3.042314772 podStartE2EDuration="21.934494832s" podCreationTimestamp="2026-04-20 19:08:12 +0000 UTC" firstStartedPulling="2026-04-20 19:08:14.51703798 +0000 UTC m=+3.236340743" lastFinishedPulling="2026-04-20 19:08:33.409218049 +0000 UTC m=+22.128520803" observedRunningTime="2026-04-20 19:08:33.934140172 +0000 UTC m=+22.653442943" watchObservedRunningTime="2026-04-20 19:08:33.934494832 +0000 UTC m=+22.653797604" Apr 20 19:08:34.924825 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:34.924770 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" event={"ID":"73d65bde-5605-46fa-9c02-62fdc3f51501","Type":"ContainerStarted","Data":"aa2a98ba7451d05efed4f5e60c4c13dd76c8deb21fdfe7d17a4dbf2ecadbaf17"} Apr 20 19:08:35.825093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:35.825055 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:35.825273 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:35.825055 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:35.825273 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:35.825187 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:35.825394 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:35.825066 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:35.825394 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:35.825287 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:35.825394 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:35.825383 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:36.932909 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:36.932870 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" event={"ID":"73d65bde-5605-46fa-9c02-62fdc3f51501","Type":"ContainerStarted","Data":"15b580dfa2ff1cc98319760bff6f45369fc506f67b657a9170ad991b5881fe3e"} Apr 20 19:08:36.933457 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:36.933178 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:36.933457 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:36.933209 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:36.933457 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:36.933224 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:36.951605 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:36.951576 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:36.952029 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:36.952008 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:08:36.957242 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:36.957203 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dz9mm" podStartSLOduration=9.041015148 podStartE2EDuration="25.957191885s" podCreationTimestamp="2026-04-20 19:08:11 +0000 UTC" firstStartedPulling="2026-04-20 19:08:14.517124099 +0000 UTC m=+3.236426862" lastFinishedPulling="2026-04-20 19:08:31.433300846 +0000 UTC m=+20.152603599" observedRunningTime="2026-04-20 19:08:33.960907562 +0000 UTC m=+22.680210334" watchObservedRunningTime="2026-04-20 19:08:36.957191885 +0000 UTC m=+25.676494657" Apr 20 19:08:36.957554 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:36.957525 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" podStartSLOduration=8.942681279 podStartE2EDuration="25.95751864s" podCreationTimestamp="2026-04-20 19:08:11 +0000 UTC" firstStartedPulling="2026-04-20 19:08:14.514326142 +0000 UTC m=+3.233628907" lastFinishedPulling="2026-04-20 19:08:31.529163518 +0000 UTC m=+20.248466268" observedRunningTime="2026-04-20 19:08:36.957164547 +0000 UTC m=+25.676467321" watchObservedRunningTime="2026-04-20 19:08:36.95751864 +0000 UTC m=+25.676821414" Apr 20 19:08:37.825270 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:37.825237 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:37.825270 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:37.825259 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:37.825507 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:37.825361 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:37.825507 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:37.825395 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:37.825609 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:37.825530 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:37.825658 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:37.825626 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:38.415753 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:38.415690 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nvhzk"] Apr 20 19:08:38.416093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:38.415837 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:38.416093 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:38.415952 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:38.420060 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:38.419753 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5nsbx"] Apr 20 19:08:38.420060 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:38.419881 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:38.420060 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:38.420004 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:38.421026 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:38.420643 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rwnnv"] Apr 20 19:08:38.421026 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:38.420765 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:38.421026 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:38.420899 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:38.938490 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:38.938272 2572 generic.go:358] "Generic (PLEG): container finished" podID="09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b" containerID="87af03dbe6b608adaed457c762db94ec5a090f9de81752d090ea7671662098d3" exitCode=0 Apr 20 19:08:38.938646 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:38.938347 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerDied","Data":"87af03dbe6b608adaed457c762db94ec5a090f9de81752d090ea7671662098d3"} Apr 20 19:08:39.827663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:39.827634 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:39.827663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:39.827656 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:39.828155 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:39.827640 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:39.828155 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:39.827728 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:39.828155 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:39.827823 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:39.828155 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:39.827910 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:40.943097 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:40.943058 2572 generic.go:358] "Generic (PLEG): container finished" podID="09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b" containerID="ab58e3e5c4fb1077aa14ca08713300f330004f1434f4841ea3e589f97440a858" exitCode=0 Apr 20 19:08:40.943787 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:40.943133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerDied","Data":"ab58e3e5c4fb1077aa14ca08713300f330004f1434f4841ea3e589f97440a858"} Apr 20 19:08:41.827141 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:41.827072 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:41.827276 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:41.827072 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:41.827276 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:41.827159 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:41.827276 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:41.827076 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:41.827276 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:41.827228 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:41.827413 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:41.827292 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:42.948351 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:42.948317 2572 generic.go:358] "Generic (PLEG): container finished" podID="09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b" containerID="54ec18619fc261597c89453d4bf13c7f8752f868d97d00745795f2adadf00a55" exitCode=0 Apr 20 19:08:42.948786 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:42.948381 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerDied","Data":"54ec18619fc261597c89453d4bf13c7f8752f868d97d00745795f2adadf00a55"} Apr 20 19:08:43.825304 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:43.825270 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:43.825304 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:43.825289 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:43.825304 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:43.825301 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:43.825601 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:43.825413 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:08:43.825601 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:43.825569 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-nvhzk" podUID="d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e" Apr 20 19:08:43.826250 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:43.825675 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5nsbx" podUID="7850d8bf-83d1-45ed-9a2d-cccbc11a2db8" Apr 20 19:08:44.598288 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.598260 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-5.ec2.internal" event="NodeReady" Apr 20 19:08:44.598873 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.598401 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 19:08:44.643168 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.643137 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-srqmp"] Apr 20 19:08:44.668846 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.668812 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8c6k6"] Apr 20 19:08:44.669079 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.669006 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.671170 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.671145 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 19:08:44.671292 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.671179 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 19:08:44.671575 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.671426 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ggf2q\"" Apr 20 19:08:44.689460 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.689434 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-srqmp"] Apr 20 19:08:44.689590 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.689464 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8c6k6"] Apr 20 19:08:44.689590 
ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.689574 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:44.691586 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.691564 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 19:08:44.691701 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.691650 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gtxhr\"" Apr 20 19:08:44.691701 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.691654 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 19:08:44.691815 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.691651 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 19:08:44.852622 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.852536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zhdw\" (UniqueName: \"kubernetes.io/projected/70ee668a-415c-4913-9312-a001e69b58d8-kube-api-access-4zhdw\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:44.852622 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.852581 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6tv\" (UniqueName: \"kubernetes.io/projected/808a40da-6675-4800-964d-852bb302978e-kube-api-access-lx6tv\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.852858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.852645 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/808a40da-6675-4800-964d-852bb302978e-config-volume\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.852858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.852690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.852858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.852745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:44.852858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.852801 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/808a40da-6675-4800-964d-852bb302978e-tmp-dir\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.954170 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.954135 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.954346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.954201 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:44.954346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.954298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/808a40da-6675-4800-964d-852bb302978e-tmp-dir\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.954346 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:44.954306 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:08:44.954346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.954334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zhdw\" (UniqueName: \"kubernetes.io/projected/70ee668a-415c-4913-9312-a001e69b58d8-kube-api-access-4zhdw\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:44.954594 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.954350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6tv\" (UniqueName: \"kubernetes.io/projected/808a40da-6675-4800-964d-852bb302978e-kube-api-access-lx6tv\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.954594 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:44.954378 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls podName:808a40da-6675-4800-964d-852bb302978e nodeName:}" failed. 
No retries permitted until 2026-04-20 19:08:45.454354841 +0000 UTC m=+34.173657597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls") pod "dns-default-srqmp" (UID: "808a40da-6675-4800-964d-852bb302978e") : secret "dns-default-metrics-tls" not found Apr 20 19:08:44.954594 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:44.954306 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:08:44.954594 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.954428 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/808a40da-6675-4800-964d-852bb302978e-config-volume\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.954594 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:44.954452 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert podName:70ee668a-415c-4913-9312-a001e69b58d8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:45.454437284 +0000 UTC m=+34.173740044 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert") pod "ingress-canary-8c6k6" (UID: "70ee668a-415c-4913-9312-a001e69b58d8") : secret "canary-serving-cert" not found Apr 20 19:08:44.955081 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.955059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/808a40da-6675-4800-964d-852bb302978e-config-volume\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.961742 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.961676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/808a40da-6675-4800-964d-852bb302978e-tmp-dir\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.964889 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.964870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6tv\" (UniqueName: \"kubernetes.io/projected/808a40da-6675-4800-964d-852bb302978e-kube-api-access-lx6tv\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:44.965136 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:44.965113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zhdw\" (UniqueName: \"kubernetes.io/projected/70ee668a-415c-4913-9312-a001e69b58d8-kube-api-access-4zhdw\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:45.459329 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.459293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:45.459630 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.459355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:45.459630 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.459392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:45.459630 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.459459 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:08:45.459630 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.459508 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:08:45.459630 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.459507 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:45.459630 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.459554 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert podName:70ee668a-415c-4913-9312-a001e69b58d8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:46.459532126 +0000 UTC m=+35.178834875 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert") pod "ingress-canary-8c6k6" (UID: "70ee668a-415c-4913-9312-a001e69b58d8") : secret "canary-serving-cert" not found Apr 20 19:08:45.459630 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.459580 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs podName:e9c238c6-ab0d-4140-b842-f59e7642479c nodeName:}" failed. No retries permitted until 2026-04-20 19:09:17.45956525 +0000 UTC m=+66.178868000 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs") pod "network-metrics-daemon-rwnnv" (UID: "e9c238c6-ab0d-4140-b842-f59e7642479c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:45.459630 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.459597 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls podName:808a40da-6675-4800-964d-852bb302978e nodeName:}" failed. No retries permitted until 2026-04-20 19:08:46.459588542 +0000 UTC m=+35.178891293 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls") pod "dns-default-srqmp" (UID: "808a40da-6675-4800-964d-852bb302978e") : secret "dns-default-metrics-tls" not found Apr 20 19:08:45.661462 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.661206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:45.661897 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.661355 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:45.661897 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.661544 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:45.661897 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.661557 2572 projected.go:194] Error preparing data for projected volume kube-api-access-h5bcf for pod openshift-network-diagnostics/network-check-target-nvhzk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:45.661897 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:45.661619 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf podName:d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e nodeName:}" failed. No retries permitted until 2026-04-20 19:09:17.661598481 +0000 UTC m=+66.380901245 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h5bcf" (UniqueName: "kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf") pod "network-check-target-nvhzk" (UID: "d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:45.824616 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.824533 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:08:45.824775 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.824533 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:45.824874 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.824533 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:08:45.827414 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.827368 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:08:45.827414 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.827380 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:08:45.827414 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.827373 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 19:08:45.828183 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.827905 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:08:45.828183 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.827951 2572 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wstmx\"" Apr 20 19:08:45.828183 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:45.827905 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bdmvj\"" Apr 20 19:08:46.366940 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:46.366889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:46.369807 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:46.369785 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/7850d8bf-83d1-45ed-9a2d-cccbc11a2db8-original-pull-secret\") pod \"global-pull-secret-syncer-5nsbx\" (UID: \"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8\") " pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:46.445003 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:46.444962 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5nsbx" Apr 20 19:08:46.468142 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:46.468111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:46.468298 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:46.468169 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:46.468298 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:46.468260 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:08:46.468389 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:46.468329 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert podName:70ee668a-415c-4913-9312-a001e69b58d8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:48.468308883 +0000 UTC m=+37.187611635 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert") pod "ingress-canary-8c6k6" (UID: "70ee668a-415c-4913-9312-a001e69b58d8") : secret "canary-serving-cert" not found Apr 20 19:08:46.468389 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:46.468261 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:08:46.468504 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:46.468436 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls podName:808a40da-6675-4800-964d-852bb302978e nodeName:}" failed. No retries permitted until 2026-04-20 19:08:48.468411722 +0000 UTC m=+37.187714541 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls") pod "dns-default-srqmp" (UID: "808a40da-6675-4800-964d-852bb302978e") : secret "dns-default-metrics-tls" not found Apr 20 19:08:48.483262 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:48.483222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:48.483695 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:48.483278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:48.483695 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:48.483370 2572 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:08:48.483695 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:48.483436 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert podName:70ee668a-415c-4913-9312-a001e69b58d8 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:52.483421116 +0000 UTC m=+41.202723866 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert") pod "ingress-canary-8c6k6" (UID: "70ee668a-415c-4913-9312-a001e69b58d8") : secret "canary-serving-cert" not found Apr 20 19:08:48.483695 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:48.483376 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:08:48.483695 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:48.483549 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls podName:808a40da-6675-4800-964d-852bb302978e nodeName:}" failed. No retries permitted until 2026-04-20 19:08:52.483531194 +0000 UTC m=+41.202833947 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls") pod "dns-default-srqmp" (UID: "808a40da-6675-4800-964d-852bb302978e") : secret "dns-default-metrics-tls" not found Apr 20 19:08:48.627857 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:48.627804 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5nsbx"] Apr 20 19:08:48.682154 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:08:48.682115 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7850d8bf_83d1_45ed_9a2d_cccbc11a2db8.slice/crio-49525f8189e8313b0db0e73ba6d3c1f8ad755503bc44db1e1bc4d6dcb3e25208 WatchSource:0}: Error finding container 49525f8189e8313b0db0e73ba6d3c1f8ad755503bc44db1e1bc4d6dcb3e25208: Status 404 returned error can't find the container with id 49525f8189e8313b0db0e73ba6d3c1f8ad755503bc44db1e1bc4d6dcb3e25208 Apr 20 19:08:48.963289 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:48.963238 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerStarted","Data":"fa6c5e8dc5cbf459e3fd1b3b0a5a406cbe641155f494f60e92755e0c3cf13acb"} Apr 20 19:08:48.964436 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:48.964399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5nsbx" event={"ID":"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8","Type":"ContainerStarted","Data":"49525f8189e8313b0db0e73ba6d3c1f8ad755503bc44db1e1bc4d6dcb3e25208"} Apr 20 19:08:49.969666 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:49.969448 2572 generic.go:358] "Generic (PLEG): container finished" podID="09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b" containerID="fa6c5e8dc5cbf459e3fd1b3b0a5a406cbe641155f494f60e92755e0c3cf13acb" exitCode=0 Apr 20 19:08:49.970158 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:08:49.969512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerDied","Data":"fa6c5e8dc5cbf459e3fd1b3b0a5a406cbe641155f494f60e92755e0c3cf13acb"} Apr 20 19:08:50.975440 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:50.975401 2572 generic.go:358] "Generic (PLEG): container finished" podID="09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b" containerID="0f2fd96f4de4e0db2381553d4c0c75357f9c61de68d32c01d581a53c59c75645" exitCode=0 Apr 20 19:08:50.975850 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:50.975494 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerDied","Data":"0f2fd96f4de4e0db2381553d4c0c75357f9c61de68d32c01d581a53c59c75645"} Apr 20 19:08:51.981043 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:51.981005 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tng2d" event={"ID":"09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b","Type":"ContainerStarted","Data":"e2fa89fb67d70f61b00a3cb0209e6ccc7aa56d8a65e55aa51f74faedcf985040"} Apr 20 19:08:52.003933 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:52.003875 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tng2d" podStartSLOduration=6.812622039 podStartE2EDuration="41.003858844s" podCreationTimestamp="2026-04-20 19:08:11 +0000 UTC" firstStartedPulling="2026-04-20 19:08:14.521944424 +0000 UTC m=+3.241247174" lastFinishedPulling="2026-04-20 19:08:48.713181214 +0000 UTC m=+37.432483979" observedRunningTime="2026-04-20 19:08:52.002886805 +0000 UTC m=+40.722189581" watchObservedRunningTime="2026-04-20 19:08:52.003858844 +0000 UTC m=+40.723161620" Apr 20 19:08:52.513721 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:52.513682 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:08:52.513896 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:52.513782 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:08:52.513896 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:52.513834 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:08:52.513896 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:52.513889 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:08:52.514030 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:52.513924 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert podName:70ee668a-415c-4913-9312-a001e69b58d8 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:00.513907004 +0000 UTC m=+49.233209755 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert") pod "ingress-canary-8c6k6" (UID: "70ee668a-415c-4913-9312-a001e69b58d8") : secret "canary-serving-cert" not found Apr 20 19:08:52.514030 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:08:52.513940 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls podName:808a40da-6675-4800-964d-852bb302978e nodeName:}" failed. 
No retries permitted until 2026-04-20 19:09:00.513933038 +0000 UTC m=+49.233235788 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls") pod "dns-default-srqmp" (UID: "808a40da-6675-4800-964d-852bb302978e") : secret "dns-default-metrics-tls" not found Apr 20 19:08:53.985911 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:53.985872 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5nsbx" event={"ID":"7850d8bf-83d1-45ed-9a2d-cccbc11a2db8","Type":"ContainerStarted","Data":"689509e679460415eaf99878efe467e9e133018d947091ddf19f4a6856a3e46d"} Apr 20 19:08:54.006751 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:08:54.006703 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5nsbx" podStartSLOduration=35.558871342 podStartE2EDuration="40.006689151s" podCreationTimestamp="2026-04-20 19:08:14 +0000 UTC" firstStartedPulling="2026-04-20 19:08:48.691574809 +0000 UTC m=+37.410877559" lastFinishedPulling="2026-04-20 19:08:53.139392605 +0000 UTC m=+41.858695368" observedRunningTime="2026-04-20 19:08:54.006520846 +0000 UTC m=+42.725823620" watchObservedRunningTime="2026-04-20 19:08:54.006689151 +0000 UTC m=+42.725991924" Apr 20 19:09:00.571033 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:00.569158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:09:00.571033 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:00.569253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod 
\"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:09:00.571033 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:00.569412 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:09:00.571033 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:00.569503 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert podName:70ee668a-415c-4913-9312-a001e69b58d8 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:16.569454679 +0000 UTC m=+65.288757449 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert") pod "ingress-canary-8c6k6" (UID: "70ee668a-415c-4913-9312-a001e69b58d8") : secret "canary-serving-cert" not found Apr 20 19:09:00.571033 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:00.569992 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:09:00.571033 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:00.570053 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls podName:808a40da-6675-4800-964d-852bb302978e nodeName:}" failed. No retries permitted until 2026-04-20 19:09:16.570037305 +0000 UTC m=+65.289340057 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls") pod "dns-default-srqmp" (UID: "808a40da-6675-4800-964d-852bb302978e") : secret "dns-default-metrics-tls" not found Apr 20 19:09:08.956606 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:08.956576 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hqs9s" Apr 20 19:09:14.814277 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.814238 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj"] Apr 20 19:09:14.820551 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.820531 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:14.823352 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.823330 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 19:09:14.823683 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.823345 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 19:09:14.823938 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.823914 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 19:09:14.824058 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.824042 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 19:09:14.824465 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.824395 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj"] Apr 20 19:09:14.863315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.863292 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r"] Apr 20 19:09:14.863732 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.863709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a90d00ae-29bb-4e3f-b6ec-33d288c3b449-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c854dddc7-xpxtj\" (UID: \"a90d00ae-29bb-4e3f-b6ec-33d288c3b449\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:14.863839 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.863745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdv69\" (UniqueName: \"kubernetes.io/projected/a90d00ae-29bb-4e3f-b6ec-33d288c3b449-kube-api-access-hdv69\") pod \"klusterlet-addon-workmgr-7c854dddc7-xpxtj\" (UID: \"a90d00ae-29bb-4e3f-b6ec-33d288c3b449\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:14.863888 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.863842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a90d00ae-29bb-4e3f-b6ec-33d288c3b449-tmp\") pod \"klusterlet-addon-workmgr-7c854dddc7-xpxtj\" (UID: \"a90d00ae-29bb-4e3f-b6ec-33d288c3b449\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:14.866202 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.866188 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:14.868313 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.868291 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 19:09:14.868313 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.868310 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 19:09:14.868454 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.868301 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 19:09:14.868454 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.868349 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 19:09:14.874397 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.874373 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r"] Apr 20 19:09:14.964998 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.964960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a90d00ae-29bb-4e3f-b6ec-33d288c3b449-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c854dddc7-xpxtj\" (UID: \"a90d00ae-29bb-4e3f-b6ec-33d288c3b449\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:14.964998 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.965005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdv69\" (UniqueName: 
\"kubernetes.io/projected/a90d00ae-29bb-4e3f-b6ec-33d288c3b449-kube-api-access-hdv69\") pod \"klusterlet-addon-workmgr-7c854dddc7-xpxtj\" (UID: \"a90d00ae-29bb-4e3f-b6ec-33d288c3b449\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:14.965229 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.965032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-ca\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:14.965229 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.965061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:14.965229 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.965111 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-hub\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:14.965229 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.965162 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-service-proxy-server-cert\") pod 
\"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:14.965229 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.965196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4lhh\" (UniqueName: \"kubernetes.io/projected/9742c349-1fef-4082-91d3-df662355e514-kube-api-access-k4lhh\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:14.965229 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.965230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a90d00ae-29bb-4e3f-b6ec-33d288c3b449-tmp\") pod \"klusterlet-addon-workmgr-7c854dddc7-xpxtj\" (UID: \"a90d00ae-29bb-4e3f-b6ec-33d288c3b449\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:14.965426 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.965249 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9742c349-1fef-4082-91d3-df662355e514-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:14.965607 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.965588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a90d00ae-29bb-4e3f-b6ec-33d288c3b449-tmp\") pod \"klusterlet-addon-workmgr-7c854dddc7-xpxtj\" (UID: \"a90d00ae-29bb-4e3f-b6ec-33d288c3b449\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:14.968912 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.968891 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a90d00ae-29bb-4e3f-b6ec-33d288c3b449-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c854dddc7-xpxtj\" (UID: \"a90d00ae-29bb-4e3f-b6ec-33d288c3b449\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:14.973691 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:14.973669 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdv69\" (UniqueName: \"kubernetes.io/projected/a90d00ae-29bb-4e3f-b6ec-33d288c3b449-kube-api-access-hdv69\") pod \"klusterlet-addon-workmgr-7c854dddc7-xpxtj\" (UID: \"a90d00ae-29bb-4e3f-b6ec-33d288c3b449\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:15.065721 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.065662 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-ca\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.065721 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.065692 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.065878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.065725 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-hub\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.065878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.065742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.065878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.065759 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4lhh\" (UniqueName: \"kubernetes.io/projected/9742c349-1fef-4082-91d3-df662355e514-kube-api-access-k4lhh\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.065878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.065778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/9742c349-1fef-4082-91d3-df662355e514-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.066620 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.066592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/9742c349-1fef-4082-91d3-df662355e514-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.068018 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.067991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-hub\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.068106 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.068048 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-ca\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.068148 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.068115 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.068236 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.068220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9742c349-1fef-4082-91d3-df662355e514-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.073796 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.073772 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4lhh\" (UniqueName: \"kubernetes.io/projected/9742c349-1fef-4082-91d3-df662355e514-kube-api-access-k4lhh\") pod \"cluster-proxy-proxy-agent-686c9c8c65-wb55r\" (UID: \"9742c349-1fef-4082-91d3-df662355e514\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.130214 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.130181 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:15.193677 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.189291 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:09:15.262315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.262270 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj"] Apr 20 19:09:15.265888 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:09:15.265858 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda90d00ae_29bb_4e3f_b6ec_33d288c3b449.slice/crio-2155f655754bdb72f88d2556467d32a3fc6d2b864d0ba2634ff24f4c61e4ed40 WatchSource:0}: Error finding container 2155f655754bdb72f88d2556467d32a3fc6d2b864d0ba2634ff24f4c61e4ed40: Status 404 returned error can't find the container with id 2155f655754bdb72f88d2556467d32a3fc6d2b864d0ba2634ff24f4c61e4ed40 Apr 20 19:09:15.321117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:15.321047 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r"] Apr 20 19:09:15.323908 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:09:15.323880 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9742c349_1fef_4082_91d3_df662355e514.slice/crio-5955eaf0231e2eb1ba2edcc19c3e1c8a45ec6d21fd52f42f18d64b73f1d10694 WatchSource:0}: Error finding container 5955eaf0231e2eb1ba2edcc19c3e1c8a45ec6d21fd52f42f18d64b73f1d10694: Status 404 returned error can't find the container with id 5955eaf0231e2eb1ba2edcc19c3e1c8a45ec6d21fd52f42f18d64b73f1d10694 Apr 20 19:09:16.030757 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:16.030690 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" event={"ID":"9742c349-1fef-4082-91d3-df662355e514","Type":"ContainerStarted","Data":"5955eaf0231e2eb1ba2edcc19c3e1c8a45ec6d21fd52f42f18d64b73f1d10694"} Apr 20 19:09:16.033164 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:16.033133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" event={"ID":"a90d00ae-29bb-4e3f-b6ec-33d288c3b449","Type":"ContainerStarted","Data":"2155f655754bdb72f88d2556467d32a3fc6d2b864d0ba2634ff24f4c61e4ed40"} Apr 20 19:09:16.576673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:16.576360 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:09:16.576673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:16.576456 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:09:16.576673 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:16.576538 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:09:16.576673 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:16.576586 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:09:16.576673 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:16.576623 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert podName:70ee668a-415c-4913-9312-a001e69b58d8 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:48.576601471 +0000 UTC m=+97.295904226 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert") pod "ingress-canary-8c6k6" (UID: "70ee668a-415c-4913-9312-a001e69b58d8") : secret "canary-serving-cert" not found Apr 20 19:09:16.576673 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:16.576650 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls podName:808a40da-6675-4800-964d-852bb302978e nodeName:}" failed. No retries permitted until 2026-04-20 19:09:48.576631886 +0000 UTC m=+97.295934635 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls") pod "dns-default-srqmp" (UID: "808a40da-6675-4800-964d-852bb302978e") : secret "dns-default-metrics-tls" not found Apr 20 19:09:17.483483 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:17.483429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:09:17.486384 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:17.486356 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:09:17.493970 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:17.493830 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:09:17.493970 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:17.493906 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs podName:e9c238c6-ab0d-4140-b842-f59e7642479c nodeName:}" failed. No retries permitted until 2026-04-20 19:10:21.493880867 +0000 UTC m=+130.213183631 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs") pod "network-metrics-daemon-rwnnv" (UID: "e9c238c6-ab0d-4140-b842-f59e7642479c") : secret "metrics-daemon-secret" not found Apr 20 19:09:17.685714 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:17.685680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:09:17.688255 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:17.688224 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:09:17.698304 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:17.698275 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:09:17.709965 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:17.709935 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5bcf\" (UniqueName: \"kubernetes.io/projected/d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e-kube-api-access-h5bcf\") pod \"network-check-target-nvhzk\" (UID: \"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e\") " pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:09:17.952762 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:17.952719 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wstmx\"" Apr 20 19:09:17.961045 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:17.961014 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:09:18.398050 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:18.398020 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-nvhzk"] Apr 20 19:09:19.499005 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:09:19.498968 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f0d4ce_307e_4bf5_99dc_e95da51d3c9e.slice/crio-ac6837b457e39be457d15baa968268d86dcf05966e80373629cdca2a88c38d70 WatchSource:0}: Error finding container ac6837b457e39be457d15baa968268d86dcf05966e80373629cdca2a88c38d70: Status 404 returned error can't find the container with id ac6837b457e39be457d15baa968268d86dcf05966e80373629cdca2a88c38d70 Apr 20 19:09:20.044092 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:20.044049 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" event={"ID":"9742c349-1fef-4082-91d3-df662355e514","Type":"ContainerStarted","Data":"3171a4394bb950cd7259a53d569bc5cef351fc3f95ce24b56b42934814fa416b"} Apr 20 19:09:20.045206 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:20.045171 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-nvhzk" event={"ID":"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e","Type":"ContainerStarted","Data":"ac6837b457e39be457d15baa968268d86dcf05966e80373629cdca2a88c38d70"} Apr 20 19:09:20.046531 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:20.046507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" event={"ID":"a90d00ae-29bb-4e3f-b6ec-33d288c3b449","Type":"ContainerStarted","Data":"4d34f4e8eec6e6952baef3816b2d0b14105c225a5dd3329f64f4e94f75e82d7e"} Apr 20 19:09:20.046774 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:09:20.046750 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:20.048528 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:20.048509 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" Apr 20 19:09:20.064377 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:20.064327 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c854dddc7-xpxtj" podStartSLOduration=1.765031963 podStartE2EDuration="6.064315694s" podCreationTimestamp="2026-04-20 19:09:14 +0000 UTC" firstStartedPulling="2026-04-20 19:09:15.267598101 +0000 UTC m=+63.986900864" lastFinishedPulling="2026-04-20 19:09:19.56688183 +0000 UTC m=+68.286184595" observedRunningTime="2026-04-20 19:09:20.063905973 +0000 UTC m=+68.783208746" watchObservedRunningTime="2026-04-20 19:09:20.064315694 +0000 UTC m=+68.783618466" Apr 20 19:09:24.056948 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:24.056914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" event={"ID":"9742c349-1fef-4082-91d3-df662355e514","Type":"ContainerStarted","Data":"6096bfb311f7b57701fbc2c2c59b11c59c24645f87ab269dce6c33649301e901"} Apr 20 19:09:24.056948 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:24.056950 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" event={"ID":"9742c349-1fef-4082-91d3-df662355e514","Type":"ContainerStarted","Data":"633ecb9a9266bf8e4487e3305e2618742151f280d9dfdc5581fb63e32d39dcbc"} Apr 20 19:09:24.058178 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:24.058155 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-nvhzk" event={"ID":"d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e","Type":"ContainerStarted","Data":"1aa988881ff2d851649d468baf7279e386f408f20279276584df40a7b26c1862"} Apr 20 19:09:24.058333 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:24.058316 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:09:24.073866 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:24.073812 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" podStartSLOduration=2.139286444 podStartE2EDuration="10.073798705s" podCreationTimestamp="2026-04-20 19:09:14 +0000 UTC" firstStartedPulling="2026-04-20 19:09:15.325662586 +0000 UTC m=+64.044965337" lastFinishedPulling="2026-04-20 19:09:23.260174846 +0000 UTC m=+71.979477598" observedRunningTime="2026-04-20 19:09:24.072750382 +0000 UTC m=+72.792053154" watchObservedRunningTime="2026-04-20 19:09:24.073798705 +0000 UTC m=+72.793101477" Apr 20 19:09:24.093852 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:24.093805 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-nvhzk" podStartSLOduration=69.339939466 podStartE2EDuration="1m13.093789876s" podCreationTimestamp="2026-04-20 19:08:11 +0000 UTC" firstStartedPulling="2026-04-20 19:09:19.500992424 +0000 UTC m=+68.220295178" lastFinishedPulling="2026-04-20 19:09:23.254842818 +0000 UTC m=+71.974145588" observedRunningTime="2026-04-20 19:09:24.093595622 +0000 UTC m=+72.812898396" watchObservedRunningTime="2026-04-20 19:09:24.093789876 +0000 UTC m=+72.813092649" Apr 20 19:09:48.615087 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:48.615050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:09:48.615544 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:48.615113 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:09:48.615544 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:48.615201 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:09:48.615544 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:48.615204 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:09:48.615544 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:48.615270 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert podName:70ee668a-415c-4913-9312-a001e69b58d8 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:52.615248354 +0000 UTC m=+161.334551104 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert") pod "ingress-canary-8c6k6" (UID: "70ee668a-415c-4913-9312-a001e69b58d8") : secret "canary-serving-cert" not found Apr 20 19:09:48.615544 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:09:48.615287 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls podName:808a40da-6675-4800-964d-852bb302978e nodeName:}" failed. 
No retries permitted until 2026-04-20 19:10:52.615280658 +0000 UTC m=+161.334583407 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls") pod "dns-default-srqmp" (UID: "808a40da-6675-4800-964d-852bb302978e") : secret "dns-default-metrics-tls" not found Apr 20 19:09:55.063778 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:09:55.063745 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-nvhzk" Apr 20 19:10:14.252775 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:14.252744 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lghm5_a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08/dns-node-resolver/0.log" Apr 20 19:10:14.849663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:14.849634 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dzjv7_89b86827-3229-4d28-8418-3ba07654afdd/node-ca/0.log" Apr 20 19:10:21.537610 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:21.537575 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:10:21.538000 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:21.537699 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:10:21.538000 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:21.537766 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs podName:e9c238c6-ab0d-4140-b842-f59e7642479c nodeName:}" failed. 
No retries permitted until 2026-04-20 19:12:23.537748622 +0000 UTC m=+252.257051378 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs") pod "network-metrics-daemon-rwnnv" (UID: "e9c238c6-ab0d-4140-b842-f59e7642479c") : secret "metrics-daemon-secret" not found Apr 20 19:10:25.730234 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.730202 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xgj9z"] Apr 20 19:10:25.733190 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.733174 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.735294 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.735270 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 19:10:25.735294 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.735290 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 19:10:25.735504 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.735299 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 19:10:25.735504 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.735277 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 19:10:25.735504 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.735274 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qwcxh\"" Apr 20 19:10:25.741997 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.741976 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-xgj9z"] Apr 20 19:10:25.768083 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.768058 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.768185 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.768121 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmvhg\" (UniqueName: \"kubernetes.io/projected/e9e9b8ab-e36a-433d-90a4-607be9937a16-kube-api-access-nmvhg\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.768185 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.768147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9e9b8ab-e36a-433d-90a4-607be9937a16-data-volume\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.768257 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.768196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e9e9b8ab-e36a-433d-90a4-607be9937a16-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.768294 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.768259 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e9e9b8ab-e36a-433d-90a4-607be9937a16-crio-socket\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.869308 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.869272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.869595 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.869339 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmvhg\" (UniqueName: \"kubernetes.io/projected/e9e9b8ab-e36a-433d-90a4-607be9937a16-kube-api-access-nmvhg\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.869595 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.869373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9e9b8ab-e36a-433d-90a4-607be9937a16-data-volume\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.869595 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.869394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e9e9b8ab-e36a-433d-90a4-607be9937a16-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xgj9z\" (UID: 
\"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.869595 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.869434 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e9e9b8ab-e36a-433d-90a4-607be9937a16-crio-socket\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.869595 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:25.869454 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:10:25.869595 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.869541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e9e9b8ab-e36a-433d-90a4-607be9937a16-crio-socket\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.869595 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:25.869568 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls podName:e9e9b8ab-e36a-433d-90a4-607be9937a16 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:26.369543752 +0000 UTC m=+135.088846518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xgj9z" (UID: "e9e9b8ab-e36a-433d-90a4-607be9937a16") : secret "insights-runtime-extractor-tls" not found Apr 20 19:10:25.869865 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.869777 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e9e9b8ab-e36a-433d-90a4-607be9937a16-data-volume\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.869940 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.869923 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e9e9b8ab-e36a-433d-90a4-607be9937a16-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:25.877836 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:25.877817 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmvhg\" (UniqueName: \"kubernetes.io/projected/e9e9b8ab-e36a-433d-90a4-607be9937a16-kube-api-access-nmvhg\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:26.373825 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:26.373784 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " 
pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:26.374008 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:26.373929 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:10:26.374008 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:26.373998 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls podName:e9e9b8ab-e36a-433d-90a4-607be9937a16 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:27.373979748 +0000 UTC m=+136.093282499 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xgj9z" (UID: "e9e9b8ab-e36a-433d-90a4-607be9937a16") : secret "insights-runtime-extractor-tls" not found Apr 20 19:10:27.380034 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:27.379984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:27.380450 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:27.380139 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:10:27.380450 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:27.380218 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls podName:e9e9b8ab-e36a-433d-90a4-607be9937a16 nodeName:}" failed. 
No retries permitted until 2026-04-20 19:10:29.380201203 +0000 UTC m=+138.099503957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xgj9z" (UID: "e9e9b8ab-e36a-433d-90a4-607be9937a16") : secret "insights-runtime-extractor-tls" not found Apr 20 19:10:29.394488 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:29.394435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:29.394998 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:29.394618 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:10:29.394998 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:29.394715 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls podName:e9e9b8ab-e36a-433d-90a4-607be9937a16 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:33.39469322 +0000 UTC m=+142.113995974 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xgj9z" (UID: "e9e9b8ab-e36a-433d-90a4-607be9937a16") : secret "insights-runtime-extractor-tls" not found Apr 20 19:10:33.423832 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:33.423787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:33.424303 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:33.424277 2572 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:10:33.424402 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:33.424389 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls podName:e9e9b8ab-e36a-433d-90a4-607be9937a16 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:41.424368525 +0000 UTC m=+150.143671282 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xgj9z" (UID: "e9e9b8ab-e36a-433d-90a4-607be9937a16") : secret "insights-runtime-extractor-tls" not found Apr 20 19:10:41.481767 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:41.481725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:41.483970 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:41.483939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e9e9b8ab-e36a-433d-90a4-607be9937a16-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xgj9z\" (UID: \"e9e9b8ab-e36a-433d-90a4-607be9937a16\") " pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:41.641816 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:41.641773 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xgj9z" Apr 20 19:10:41.760369 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:41.760304 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xgj9z"] Apr 20 19:10:41.763518 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:10:41.763492 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9e9b8ab_e36a_433d_90a4_607be9937a16.slice/crio-2d02efec916fcbe4d85bd82295de3357a6e58b98ccb90377304ed068b6264f0c WatchSource:0}: Error finding container 2d02efec916fcbe4d85bd82295de3357a6e58b98ccb90377304ed068b6264f0c: Status 404 returned error can't find the container with id 2d02efec916fcbe4d85bd82295de3357a6e58b98ccb90377304ed068b6264f0c Apr 20 19:10:42.240596 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:42.240562 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xgj9z" event={"ID":"e9e9b8ab-e36a-433d-90a4-607be9937a16","Type":"ContainerStarted","Data":"351a10ffce714c8cb1825dfd0f26deb4be62b7b618c83b3b56295257ca8c8570"} Apr 20 19:10:42.240596 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:42.240598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xgj9z" event={"ID":"e9e9b8ab-e36a-433d-90a4-607be9937a16","Type":"ContainerStarted","Data":"2d02efec916fcbe4d85bd82295de3357a6e58b98ccb90377304ed068b6264f0c"} Apr 20 19:10:43.244509 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:43.244453 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xgj9z" event={"ID":"e9e9b8ab-e36a-433d-90a4-607be9937a16","Type":"ContainerStarted","Data":"05ad635cdcc5f8b65a9c5f2702254e0cbb07d8b767533e0ef39db0afcf871862"} Apr 20 19:10:44.248722 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:44.248634 2572 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-insights/insights-runtime-extractor-xgj9z" event={"ID":"e9e9b8ab-e36a-433d-90a4-607be9937a16","Type":"ContainerStarted","Data":"e07bb2679d99d0323b4f9e959b4261c1c8d44c5f6271e9353ccb7da41307059e"} Apr 20 19:10:44.266537 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:44.266492 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xgj9z" podStartSLOduration=17.089428252 podStartE2EDuration="19.266457625s" podCreationTimestamp="2026-04-20 19:10:25 +0000 UTC" firstStartedPulling="2026-04-20 19:10:41.816997208 +0000 UTC m=+150.536299958" lastFinishedPulling="2026-04-20 19:10:43.994026576 +0000 UTC m=+152.713329331" observedRunningTime="2026-04-20 19:10:44.265504827 +0000 UTC m=+152.984807600" watchObservedRunningTime="2026-04-20 19:10:44.266457625 +0000 UTC m=+152.985760398" Apr 20 19:10:46.869616 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.869582 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-665dcb5b4b-6xv5x"] Apr 20 19:10:46.872501 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.872465 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:46.875163 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.875140 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 19:10:46.875288 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.875208 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 19:10:46.875354 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.875319 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w68rf\"" Apr 20 19:10:46.875688 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.875676 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 19:10:46.881867 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.881846 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 19:10:46.882598 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.882578 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-665dcb5b4b-6xv5x"] Apr 20 19:10:46.925662 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.925628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c5ac44b-a86d-483e-a156-7693b47bc2db-registry-tls\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:46.925662 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.925661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s8lcf\" (UniqueName: \"kubernetes.io/projected/0c5ac44b-a86d-483e-a156-7693b47bc2db-kube-api-access-s8lcf\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:46.925858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.925731 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c5ac44b-a86d-483e-a156-7693b47bc2db-registry-certificates\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:46.925858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.925758 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c5ac44b-a86d-483e-a156-7693b47bc2db-ca-trust-extracted\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:46.925858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.925790 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c5ac44b-a86d-483e-a156-7693b47bc2db-trusted-ca\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:46.925858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.925819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c5ac44b-a86d-483e-a156-7693b47bc2db-image-registry-private-configuration\") pod 
\"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:46.925858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.925837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c5ac44b-a86d-483e-a156-7693b47bc2db-installation-pull-secrets\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:46.925858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:46.925855 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c5ac44b-a86d-483e-a156-7693b47bc2db-bound-sa-token\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.027112 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.027077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c5ac44b-a86d-483e-a156-7693b47bc2db-registry-certificates\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.027292 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.027122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c5ac44b-a86d-483e-a156-7693b47bc2db-ca-trust-extracted\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.027292 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:10:47.027155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c5ac44b-a86d-483e-a156-7693b47bc2db-trusted-ca\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.027292 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.027199 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c5ac44b-a86d-483e-a156-7693b47bc2db-image-registry-private-configuration\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.027292 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.027228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c5ac44b-a86d-483e-a156-7693b47bc2db-installation-pull-secrets\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.027292 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.027254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c5ac44b-a86d-483e-a156-7693b47bc2db-bound-sa-token\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.027292 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.027277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c5ac44b-a86d-483e-a156-7693b47bc2db-registry-tls\") pod 
\"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.027641 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.027304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8lcf\" (UniqueName: \"kubernetes.io/projected/0c5ac44b-a86d-483e-a156-7693b47bc2db-kube-api-access-s8lcf\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.027697 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.027650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c5ac44b-a86d-483e-a156-7693b47bc2db-ca-trust-extracted\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.028109 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.028084 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c5ac44b-a86d-483e-a156-7693b47bc2db-registry-certificates\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.028247 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.028224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c5ac44b-a86d-483e-a156-7693b47bc2db-trusted-ca\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.029661 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.029636 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c5ac44b-a86d-483e-a156-7693b47bc2db-image-registry-private-configuration\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.029751 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.029700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c5ac44b-a86d-483e-a156-7693b47bc2db-installation-pull-secrets\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.029829 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.029811 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c5ac44b-a86d-483e-a156-7693b47bc2db-registry-tls\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.043622 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.043593 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8lcf\" (UniqueName: \"kubernetes.io/projected/0c5ac44b-a86d-483e-a156-7693b47bc2db-kube-api-access-s8lcf\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.043757 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.043620 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c5ac44b-a86d-483e-a156-7693b47bc2db-bound-sa-token\") pod \"image-registry-665dcb5b4b-6xv5x\" (UID: \"0c5ac44b-a86d-483e-a156-7693b47bc2db\") " 
pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.181695 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.181671 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:47.319832 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:47.319801 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-665dcb5b4b-6xv5x"] Apr 20 19:10:47.322839 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:10:47.322812 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c5ac44b_a86d_483e_a156_7693b47bc2db.slice/crio-75e603da4fb10e97314f2b72fb6cc2ab03efaed8f559ec27634fc992059faba7 WatchSource:0}: Error finding container 75e603da4fb10e97314f2b72fb6cc2ab03efaed8f559ec27634fc992059faba7: Status 404 returned error can't find the container with id 75e603da4fb10e97314f2b72fb6cc2ab03efaed8f559ec27634fc992059faba7 Apr 20 19:10:47.680882 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:47.680837 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-srqmp" podUID="808a40da-6675-4800-964d-852bb302978e" Apr 20 19:10:47.699091 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:47.699062 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8c6k6" podUID="70ee668a-415c-4913-9312-a001e69b58d8" Apr 20 19:10:48.259312 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:48.259283 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:10:48.259752 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:48.259284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" event={"ID":"0c5ac44b-a86d-483e-a156-7693b47bc2db","Type":"ContainerStarted","Data":"b43e3a4e43de616005725df4ccf4212cb9604486e09064d2604d1afd6d72cbba"} Apr 20 19:10:48.259752 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:48.259381 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-srqmp" Apr 20 19:10:48.259752 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:48.259495 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:10:48.259752 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:48.259520 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" event={"ID":"0c5ac44b-a86d-483e-a156-7693b47bc2db","Type":"ContainerStarted","Data":"75e603da4fb10e97314f2b72fb6cc2ab03efaed8f559ec27634fc992059faba7"} Apr 20 19:10:48.277921 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:48.277879 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" podStartSLOduration=2.277867621 podStartE2EDuration="2.277867621s" podCreationTimestamp="2026-04-20 19:10:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:10:48.2767048 +0000 UTC m=+156.996007575" watchObservedRunningTime="2026-04-20 19:10:48.277867621 +0000 UTC m=+156.997170392" Apr 20 19:10:48.837416 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:48.837361 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed 
to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-rwnnv" podUID="e9c238c6-ab0d-4140-b842-f59e7642479c" Apr 20 19:10:52.666506 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.666388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:10:52.666506 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.666449 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:10:52.668867 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.668840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/70ee668a-415c-4913-9312-a001e69b58d8-cert\") pod \"ingress-canary-8c6k6\" (UID: \"70ee668a-415c-4913-9312-a001e69b58d8\") " pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:10:52.668971 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.668872 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/808a40da-6675-4800-964d-852bb302978e-metrics-tls\") pod \"dns-default-srqmp\" (UID: \"808a40da-6675-4800-964d-852bb302978e\") " pod="openshift-dns/dns-default-srqmp" Apr 20 19:10:52.762161 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.762127 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gtxhr\"" Apr 20 19:10:52.762518 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.762501 2572 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ggf2q\"" Apr 20 19:10:52.770053 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.770033 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8c6k6" Apr 20 19:10:52.770139 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.770063 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-srqmp" Apr 20 19:10:52.891555 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.891529 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8c6k6"] Apr 20 19:10:52.894965 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:10:52.894940 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70ee668a_415c_4913_9312_a001e69b58d8.slice/crio-760001c304ca6aebd5c194ef912ed0a0dbd7ddb7334bf339ce9aafab1574a6cb WatchSource:0}: Error finding container 760001c304ca6aebd5c194ef912ed0a0dbd7ddb7334bf339ce9aafab1574a6cb: Status 404 returned error can't find the container with id 760001c304ca6aebd5c194ef912ed0a0dbd7ddb7334bf339ce9aafab1574a6cb Apr 20 19:10:52.906757 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:52.906736 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-srqmp"] Apr 20 19:10:52.909508 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:10:52.909483 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod808a40da_6675_4800_964d_852bb302978e.slice/crio-6a04727aba948b1c52882f39f5ce75d2f05a69140b646270a1c95d4b598a1c9b WatchSource:0}: Error finding container 6a04727aba948b1c52882f39f5ce75d2f05a69140b646270a1c95d4b598a1c9b: Status 404 returned error can't find the container with id 6a04727aba948b1c52882f39f5ce75d2f05a69140b646270a1c95d4b598a1c9b Apr 20 19:10:53.272830 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:10:53.272796 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8c6k6" event={"ID":"70ee668a-415c-4913-9312-a001e69b58d8","Type":"ContainerStarted","Data":"760001c304ca6aebd5c194ef912ed0a0dbd7ddb7334bf339ce9aafab1574a6cb"} Apr 20 19:10:53.273947 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:53.273912 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-srqmp" event={"ID":"808a40da-6675-4800-964d-852bb302978e","Type":"ContainerStarted","Data":"6a04727aba948b1c52882f39f5ce75d2f05a69140b646270a1c95d4b598a1c9b"} Apr 20 19:10:54.036854 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.036817 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v9q78"] Apr 20 19:10:54.040425 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.040398 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.042969 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.042916 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 19:10:54.042969 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.042939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 19:10:54.043673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.043391 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 19:10:54.043673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.043393 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 19:10:54.043673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.043447 2572 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 19:10:54.043673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.043502 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lszj4\"" Apr 20 19:10:54.043673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.043545 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 19:10:54.176722 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.176691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-tls\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.176886 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.176739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmvz\" (UniqueName: \"kubernetes.io/projected/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-kube-api-access-hwmvz\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.176886 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.176811 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-textfile\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.176886 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.176856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-root\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.177007 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.176886 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-wtmp\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.177007 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.176905 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-metrics-client-ca\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.177007 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.176931 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.177007 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.176961 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-sys\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.177007 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:10:54.177004 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278328 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-tls\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmvz\" (UniqueName: \"kubernetes.io/projected/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-kube-api-access-hwmvz\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278367 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-textfile\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-root\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " 
pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278452 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-wtmp\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278538 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:54.278460 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 19:10:54.278538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278488 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-metrics-client-ca\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278527 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-root\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278949 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:54.278542 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-tls podName:41dab2ed-afae-48a6-8ea7-794f4f1f5e76 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:54.778521549 +0000 UTC m=+163.497824305 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-tls") pod "node-exporter-v9q78" (UID: "41dab2ed-afae-48a6-8ea7-794f4f1f5e76") : secret "node-exporter-tls" not found Apr 20 19:10:54.278949 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278597 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-sys\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278949 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278628 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-wtmp\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278949 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278647 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78" Apr 20 19:10:54.278949 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-sys\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:54.278949 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.278677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-textfile\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:54.279215 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.279194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-metrics-client-ca\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:54.279315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.279291 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:54.280946 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.280922 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:54.286904 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.286838 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmvz\" (UniqueName: \"kubernetes.io/projected/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-kube-api-access-hwmvz\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:54.784074 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:54.784044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-tls\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:54.784185 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:54.784169 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 19:10:54.784247 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:10:54.784230 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-tls podName:41dab2ed-afae-48a6-8ea7-794f4f1f5e76 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:55.784209739 +0000 UTC m=+164.503512493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-tls") pod "node-exporter-v9q78" (UID: "41dab2ed-afae-48a6-8ea7-794f4f1f5e76") : secret "node-exporter-tls" not found
Apr 20 19:10:55.190695 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.190587 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" podUID="9742c349-1fef-4082-91d3-df662355e514" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 19:10:55.279881 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.279835 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-srqmp" event={"ID":"808a40da-6675-4800-964d-852bb302978e","Type":"ContainerStarted","Data":"dd3d036db0e8b248a2e1185e1b849cde70b9ad27ad1da864d22fb0bdd3ca188c"}
Apr 20 19:10:55.279881 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.279877 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-srqmp" event={"ID":"808a40da-6675-4800-964d-852bb302978e","Type":"ContainerStarted","Data":"7eac0d50d6d385b6e2652ae3ac2d5537f8865d9f11c060c7dafc6c1f7a225da3"}
Apr 20 19:10:55.280112 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.279943 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-srqmp"
Apr 20 19:10:55.281067 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.281036 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8c6k6" event={"ID":"70ee668a-415c-4913-9312-a001e69b58d8","Type":"ContainerStarted","Data":"34a84dcba943bc93c0d20e27fa5dc2216f0a8abffe6a421b977a4609d82a2ca8"}
Apr 20 19:10:55.301424 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.301379 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-srqmp" podStartSLOduration=129.528131216 podStartE2EDuration="2m11.301367029s" podCreationTimestamp="2026-04-20 19:08:44 +0000 UTC" firstStartedPulling="2026-04-20 19:10:52.911569275 +0000 UTC m=+161.630872029" lastFinishedPulling="2026-04-20 19:10:54.684805091 +0000 UTC m=+163.404107842" observedRunningTime="2026-04-20 19:10:55.300590081 +0000 UTC m=+164.019892865" watchObservedRunningTime="2026-04-20 19:10:55.301367029 +0000 UTC m=+164.020669801"
Apr 20 19:10:55.314505 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.314456 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8c6k6" podStartSLOduration=129.483066175 podStartE2EDuration="2m11.3144466s" podCreationTimestamp="2026-04-20 19:08:44 +0000 UTC" firstStartedPulling="2026-04-20 19:10:52.897078822 +0000 UTC m=+161.616381573" lastFinishedPulling="2026-04-20 19:10:54.728459242 +0000 UTC m=+163.447761998" observedRunningTime="2026-04-20 19:10:55.313941754 +0000 UTC m=+164.033244528" watchObservedRunningTime="2026-04-20 19:10:55.3144466 +0000 UTC m=+164.033749368"
Apr 20 19:10:55.792878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.792826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-tls\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:55.795052 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.795031 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/41dab2ed-afae-48a6-8ea7-794f4f1f5e76-node-exporter-tls\") pod \"node-exporter-v9q78\" (UID: \"41dab2ed-afae-48a6-8ea7-794f4f1f5e76\") " pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:55.852435 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:55.852395 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v9q78"
Apr 20 19:10:55.859721 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:10:55.859696 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41dab2ed_afae_48a6_8ea7_794f4f1f5e76.slice/crio-084f7d8560307a6a343f0ade27b5419259788ad9ad98b0d46f4a06db9e15b321 WatchSource:0}: Error finding container 084f7d8560307a6a343f0ade27b5419259788ad9ad98b0d46f4a06db9e15b321: Status 404 returned error can't find the container with id 084f7d8560307a6a343f0ade27b5419259788ad9ad98b0d46f4a06db9e15b321
Apr 20 19:10:56.284648 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:56.284609 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9q78" event={"ID":"41dab2ed-afae-48a6-8ea7-794f4f1f5e76","Type":"ContainerStarted","Data":"084f7d8560307a6a343f0ade27b5419259788ad9ad98b0d46f4a06db9e15b321"}
Apr 20 19:10:57.288676 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:57.288643 2572 generic.go:358] "Generic (PLEG): container finished" podID="41dab2ed-afae-48a6-8ea7-794f4f1f5e76" containerID="7e97a314eb44739b1cc901fddea1695e0cedd97640094eaa4d2f2ddf84587b08" exitCode=0
Apr 20 19:10:57.289059 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:57.288698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9q78" event={"ID":"41dab2ed-afae-48a6-8ea7-794f4f1f5e76","Type":"ContainerDied","Data":"7e97a314eb44739b1cc901fddea1695e0cedd97640094eaa4d2f2ddf84587b08"}
Apr 20 19:10:58.292778 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:58.292738 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9q78" event={"ID":"41dab2ed-afae-48a6-8ea7-794f4f1f5e76","Type":"ContainerStarted","Data":"5d2ef77ac2af801201d9cc68b79df8c479efb21ff5cfbf1d9ab926139348b7ff"}
Apr 20 19:10:58.292778 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:58.292774 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9q78" event={"ID":"41dab2ed-afae-48a6-8ea7-794f4f1f5e76","Type":"ContainerStarted","Data":"b1f479b9dbad82543e4419707642b3ede50c04e7055838f72fcbbba4f56149e2"}
Apr 20 19:10:58.313882 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:10:58.313826 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v9q78" podStartSLOduration=3.368576453 podStartE2EDuration="4.313810449s" podCreationTimestamp="2026-04-20 19:10:54 +0000 UTC" firstStartedPulling="2026-04-20 19:10:55.861427316 +0000 UTC m=+164.580730080" lastFinishedPulling="2026-04-20 19:10:56.806661321 +0000 UTC m=+165.525964076" observedRunningTime="2026-04-20 19:10:58.313103981 +0000 UTC m=+167.032406753" watchObservedRunningTime="2026-04-20 19:10:58.313810449 +0000 UTC m=+167.033113218"
Apr 20 19:11:00.180524 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.180492 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 19:11:00.185100 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.185068 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.187402 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.187375 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-dp0knp9piq4hn\""
Apr 20 19:11:00.187605 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.187451 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 19:11:00.187605 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.187458 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 19:11:00.188012 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.187992 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 19:11:00.188012 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188004 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 19:11:00.188180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188019 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 19:11:00.188180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188060 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 19:11:00.188180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188172 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 19:11:00.188484 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188450 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 19:11:00.188601 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188482 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 19:11:00.188601 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188507 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 19:11:00.188601 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188455 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 19:11:00.188831 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188813 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-bdz9h\""
Apr 20 19:11:00.189005 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.188989 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 19:11:00.190445 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.190425 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 19:11:00.198223 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.198194 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 19:11:00.330723 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330689 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330723 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330724 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330742 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-config\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330843 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330862 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-web-config\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330894 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.330922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.331180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330955 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.331180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.330991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.331180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.331021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-config-out\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.331180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.331046 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.331180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.331061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.331180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.331090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.331180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.331134 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5m4t\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-kube-api-access-x5m4t\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.431515 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431424 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-config-out\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.431515 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431463 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.431515 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.431727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.431727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5m4t\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-kube-api-access-x5m4t\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.431727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.431870 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.431870 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.431870 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432016 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-config\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432016 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431912 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432016 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432016 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432016 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.431992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-web-config\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432260 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.432027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432260 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.432065 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432260 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.432096 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432260 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.432121 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432260 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.432181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.432532 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.432296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.433400 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.433158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.434643 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.434326 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-config-out\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.434763 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.434676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.435207 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.435184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.436221 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.435271 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-web-config\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.436221 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.435288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-config\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.436221 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.435309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.436221 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.435855 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.436221 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.435963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.436221 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.436093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.436516 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.436375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.436951 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.436927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.437840 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.437815 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.437940 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.437901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.438186 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.438167 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.440020 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.439998 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5m4t\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-kube-api-access-x5m4t\") pod \"prometheus-k8s-0\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.495034 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.494995 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:11:00.621156 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.621126 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 19:11:00.624644 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:11:00.624613 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb299183a_8101_4599_a023_e4a120ba27f1.slice/crio-65a21881556c3d54884fbc2aca078eea2dda5ffc1bbf146ae6d1cd2c66f8a6bd WatchSource:0}: Error finding container 65a21881556c3d54884fbc2aca078eea2dda5ffc1bbf146ae6d1cd2c66f8a6bd: Status 404 returned error can't find the container with id 65a21881556c3d54884fbc2aca078eea2dda5ffc1bbf146ae6d1cd2c66f8a6bd
Apr 20 19:11:00.824923 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:00.824845 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv"
Apr 20 19:11:01.302595 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:01.302554 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerStarted","Data":"65a21881556c3d54884fbc2aca078eea2dda5ffc1bbf146ae6d1cd2c66f8a6bd"}
Apr 20 19:11:02.306789 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:02.306750 2572 generic.go:358] "Generic (PLEG): container finished" podID="b299183a-8101-4599-a023-e4a120ba27f1" containerID="fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f" exitCode=0
Apr 20 19:11:02.307237 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:02.306832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerDied","Data":"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"}
Apr 20 19:11:05.191350 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:05.191305 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" podUID="9742c349-1fef-4082-91d3-df662355e514" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 19:11:05.286775 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:05.286752 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-srqmp"
Apr 20 19:11:05.316458 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:05.316427 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerStarted","Data":"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"}
Apr 20 19:11:06.321971 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:06.321935 2572
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerStarted","Data":"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"} Apr 20 19:11:07.327309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:07.327228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerStarted","Data":"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"} Apr 20 19:11:07.327309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:07.327272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerStarted","Data":"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"} Apr 20 19:11:07.327309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:07.327285 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerStarted","Data":"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"} Apr 20 19:11:07.327309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:07.327301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerStarted","Data":"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"} Apr 20 19:11:07.357130 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:07.357083 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=0.909500026 podStartE2EDuration="7.357063945s" podCreationTimestamp="2026-04-20 19:11:00 +0000 UTC" firstStartedPulling="2026-04-20 19:11:00.626607459 +0000 UTC m=+169.345910209" lastFinishedPulling="2026-04-20 
19:11:07.074171378 +0000 UTC m=+175.793474128" observedRunningTime="2026-04-20 19:11:07.35508654 +0000 UTC m=+176.074389313" watchObservedRunningTime="2026-04-20 19:11:07.357063945 +0000 UTC m=+176.076366718" Apr 20 19:11:09.266544 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:09.266506 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-665dcb5b4b-6xv5x" Apr 20 19:11:10.496184 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:10.496146 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:11:15.190651 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:15.190609 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" podUID="9742c349-1fef-4082-91d3-df662355e514" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 19:11:15.191044 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:15.190690 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" Apr 20 19:11:15.191193 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:15.191145 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"6096bfb311f7b57701fbc2c2c59b11c59c24645f87ab269dce6c33649301e901"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 19:11:15.191244 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:15.191205 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" podUID="9742c349-1fef-4082-91d3-df662355e514" 
containerName="service-proxy" containerID="cri-o://6096bfb311f7b57701fbc2c2c59b11c59c24645f87ab269dce6c33649301e901" gracePeriod=30 Apr 20 19:11:15.349623 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:15.349586 2572 generic.go:358] "Generic (PLEG): container finished" podID="9742c349-1fef-4082-91d3-df662355e514" containerID="6096bfb311f7b57701fbc2c2c59b11c59c24645f87ab269dce6c33649301e901" exitCode=2 Apr 20 19:11:15.349729 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:15.349653 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" event={"ID":"9742c349-1fef-4082-91d3-df662355e514","Type":"ContainerDied","Data":"6096bfb311f7b57701fbc2c2c59b11c59c24645f87ab269dce6c33649301e901"} Apr 20 19:11:16.353795 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:11:16.353763 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-686c9c8c65-wb55r" event={"ID":"9742c349-1fef-4082-91d3-df662355e514","Type":"ContainerStarted","Data":"bd785b0353ced9cb28a55dfb6cc5b27db3e7b98c6994dd9d3fd50ccc940743f6"} Apr 20 19:12:00.495501 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:00.495455 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:00.513564 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:00.513509 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:01.488021 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:01.487992 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:18.512628 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.512589 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:12:18.513187 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:12:18.513020 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="prometheus" containerID="cri-o://a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e" gracePeriod=600 Apr 20 19:12:18.513187 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.513040 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy" containerID="cri-o://37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444" gracePeriod=600 Apr 20 19:12:18.513187 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.513067 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="config-reloader" containerID="cri-o://1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462" gracePeriod=600 Apr 20 19:12:18.513187 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.513094 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy-web" containerID="cri-o://55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9" gracePeriod=600 Apr 20 19:12:18.513187 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.513050 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="thanos-sidecar" containerID="cri-o://ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855" gracePeriod=600 Apr 20 19:12:18.513187 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.513117 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41" gracePeriod=600 Apr 20 19:12:18.740309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.740286 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:18.860240 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860161 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5m4t\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-kube-api-access-x5m4t\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860240 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860200 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-metrics-client-ca\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860240 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860229 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-trusted-ca-bundle\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860523 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860253 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-thanos-prometheus-http-client-file\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: 
\"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860523 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860290 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-config-out\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860523 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860320 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-tls\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860523 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860345 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-grpc-tls\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860523 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860392 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-metrics-client-certs\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860523 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860432 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-db\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860523 ip-10-0-136-5 kubenswrapper[2572]: 
I0420 19:12:18.860457 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860523 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860509 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-tls-assets\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860925 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860535 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-serving-certs-ca-bundle\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860925 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860564 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860925 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860595 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-kubelet-serving-ca-bundle\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860925 
ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860620 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-kube-rbac-proxy\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860925 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860661 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-rulefiles-0\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860925 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860690 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-config\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860925 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860718 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-web-config\") pod \"b299183a-8101-4599-a023-e4a120ba27f1\" (UID: \"b299183a-8101-4599-a023-e4a120ba27f1\") " Apr 20 19:12:18.860925 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860746 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:18.860925 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860837 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:18.861363 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.860973 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-metrics-client-ca\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.861363 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.861006 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-trusted-ca-bundle\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.861363 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.861212 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:18.862251 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.862032 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:12:18.863978 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.863249 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:18.863978 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.863327 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-config-out" (OuterVolumeSpecName: "config-out") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:12:18.863978 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.863653 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-kube-api-access-x5m4t" (OuterVolumeSpecName: "kube-api-access-x5m4t") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "kube-api-access-x5m4t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:12:18.863978 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.863726 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:18.863978 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.863819 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:18.864453 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.864415 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:18.864619 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.864595 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:18.864701 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.864675 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:18.864701 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.864678 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:18.865454 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.865431 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:18.865538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.865488 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). 
InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:12:18.866160 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.866130 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-config" (OuterVolumeSpecName: "config") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:18.866264 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.866229 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:18.872908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.872888 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-web-config" (OuterVolumeSpecName: "web-config") pod "b299183a-8101-4599-a023-e4a120ba27f1" (UID: "b299183a-8101-4599-a023-e4a120ba27f1"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:18.962120 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962091 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-config-out\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962120 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962115 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-tls\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962120 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962126 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-grpc-tls\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962136 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-metrics-client-certs\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962146 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-db\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962155 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 
19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962164 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-tls-assets\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962173 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962183 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962192 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962200 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-secret-kube-rbac-proxy\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962208 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b299183a-8101-4599-a023-e4a120ba27f1-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath 
\"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962217 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-config\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962226 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-web-config\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962234 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x5m4t\" (UniqueName: \"kubernetes.io/projected/b299183a-8101-4599-a023-e4a120ba27f1-kube-api-access-x5m4t\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:18.962305 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:18.962243 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b299183a-8101-4599-a023-e4a120ba27f1-thanos-prometheus-http-client-file\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:12:19.524189 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524152 2572 generic.go:358] "Generic (PLEG): container finished" podID="b299183a-8101-4599-a023-e4a120ba27f1" containerID="9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41" exitCode=0 Apr 20 19:12:19.524189 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524184 2572 generic.go:358] "Generic (PLEG): container finished" podID="b299183a-8101-4599-a023-e4a120ba27f1" containerID="37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444" exitCode=0 Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524197 2572 generic.go:358] "Generic (PLEG): container finished" podID="b299183a-8101-4599-a023-e4a120ba27f1" 
containerID="55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9" exitCode=0 Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524206 2572 generic.go:358] "Generic (PLEG): container finished" podID="b299183a-8101-4599-a023-e4a120ba27f1" containerID="ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855" exitCode=0 Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524216 2572 generic.go:358] "Generic (PLEG): container finished" podID="b299183a-8101-4599-a023-e4a120ba27f1" containerID="1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462" exitCode=0 Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524225 2572 generic.go:358] "Generic (PLEG): container finished" podID="b299183a-8101-4599-a023-e4a120ba27f1" containerID="a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e" exitCode=0 Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerDied","Data":"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"} Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524269 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerDied","Data":"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"} Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524290 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerDied","Data":"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"} Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524303 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerDied","Data":"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"} Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524317 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerDied","Data":"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"} Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524330 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerDied","Data":"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"} Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524345 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b299183a-8101-4599-a023-e4a120ba27f1","Type":"ContainerDied","Data":"65a21881556c3d54884fbc2aca078eea2dda5ffc1bbf146ae6d1cd2c66f8a6bd"} Apr 20 19:12:19.524615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.524346 2572 scope.go:117] "RemoveContainer" containerID="9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41" Apr 20 19:12:19.539549 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.539527 2572 scope.go:117] "RemoveContainer" containerID="37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444" Apr 20 19:12:19.545923 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.545908 2572 scope.go:117] "RemoveContainer" containerID="55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9" Apr 20 19:12:19.552107 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.552082 2572 scope.go:117] "RemoveContainer" containerID="ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855" Apr 20 19:12:19.553019 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.552997 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:12:19.557513 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.557491 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:12:19.559233 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.559216 2572 scope.go:117] "RemoveContainer" containerID="1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462" Apr 20 19:12:19.565518 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.565498 2572 scope.go:117] "RemoveContainer" containerID="a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e" Apr 20 19:12:19.572260 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.572236 2572 scope.go:117] "RemoveContainer" containerID="fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f" Apr 20 19:12:19.578346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.578330 2572 scope.go:117] "RemoveContainer" 
containerID="9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41" Apr 20 19:12:19.578912 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:12:19.578888 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": container with ID starting with 9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41 not found: ID does not exist" containerID="9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41" Apr 20 19:12:19.579001 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.578918 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"} err="failed to get container status \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": rpc error: code = NotFound desc = could not find container \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": container with ID starting with 9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41 not found: ID does not exist" Apr 20 19:12:19.579001 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.578937 2572 scope.go:117] "RemoveContainer" containerID="37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444" Apr 20 19:12:19.579163 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:12:19.579146 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": container with ID starting with 37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444 not found: ID does not exist" containerID="37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444" Apr 20 19:12:19.579200 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.579167 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"} err="failed to get container status \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": rpc error: code = NotFound desc = could not find container \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": container with ID starting with 37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444 not found: ID does not exist" Apr 20 19:12:19.579200 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.579180 2572 scope.go:117] "RemoveContainer" containerID="55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9" Apr 20 19:12:19.579377 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:12:19.579362 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": container with ID starting with 55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9 not found: ID does not exist" containerID="55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9" Apr 20 19:12:19.579417 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.579381 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"} err="failed to get container status \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": rpc error: code = NotFound desc = could not find container \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": container with ID starting with 55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9 not found: ID does not exist" Apr 20 19:12:19.579417 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.579394 2572 scope.go:117] "RemoveContainer" containerID="ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855" Apr 20 19:12:19.579595 ip-10-0-136-5 
kubenswrapper[2572]: E0420 19:12:19.579580 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": container with ID starting with ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855 not found: ID does not exist" containerID="ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855" Apr 20 19:12:19.579645 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.579599 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"} err="failed to get container status \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": rpc error: code = NotFound desc = could not find container \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": container with ID starting with ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855 not found: ID does not exist" Apr 20 19:12:19.579645 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.579611 2572 scope.go:117] "RemoveContainer" containerID="1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462" Apr 20 19:12:19.579976 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:12:19.579947 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": container with ID starting with 1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462 not found: ID does not exist" containerID="1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462" Apr 20 19:12:19.580086 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.579986 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"} err="failed to get 
container status \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": rpc error: code = NotFound desc = could not find container \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": container with ID starting with 1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462 not found: ID does not exist" Apr 20 19:12:19.580086 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.580009 2572 scope.go:117] "RemoveContainer" containerID="a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e" Apr 20 19:12:19.580307 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:12:19.580281 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": container with ID starting with a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e not found: ID does not exist" containerID="a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e" Apr 20 19:12:19.580425 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.580311 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"} err="failed to get container status \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": rpc error: code = NotFound desc = could not find container \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": container with ID starting with a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e not found: ID does not exist" Apr 20 19:12:19.580425 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.580331 2572 scope.go:117] "RemoveContainer" containerID="fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f" Apr 20 19:12:19.581007 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:12:19.580988 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": container with ID starting with fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f not found: ID does not exist" containerID="fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f" Apr 20 19:12:19.581068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581012 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"} err="failed to get container status \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": rpc error: code = NotFound desc = could not find container \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": container with ID starting with fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f not found: ID does not exist" Apr 20 19:12:19.581068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581032 2572 scope.go:117] "RemoveContainer" containerID="9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41" Apr 20 19:12:19.581256 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581240 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"} err="failed to get container status \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": rpc error: code = NotFound desc = could not find container \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": container with ID starting with 9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41 not found: ID does not exist" Apr 20 19:12:19.581299 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581265 2572 scope.go:117] "RemoveContainer" containerID="37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444" Apr 20 19:12:19.581503 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:12:19.581461 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"} err="failed to get container status \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": rpc error: code = NotFound desc = could not find container \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": container with ID starting with 37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444 not found: ID does not exist" Apr 20 19:12:19.581577 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581505 2572 scope.go:117] "RemoveContainer" containerID="55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9" Apr 20 19:12:19.581577 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581552 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:12:19.581756 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581737 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"} err="failed to get container status \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": rpc error: code = NotFound desc = could not find container \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": container with ID starting with 55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9 not found: ID does not exist" Apr 20 19:12:19.581803 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581757 2572 scope.go:117] "RemoveContainer" containerID="ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855" Apr 20 19:12:19.581803 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581786 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="init-config-reloader" Apr 20 19:12:19.581803 
ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581800 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="init-config-reloader" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581809 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="prometheus" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581815 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="prometheus" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581820 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy-thanos" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581827 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy-thanos" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581834 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="config-reloader" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581839 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="config-reloader" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581845 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581850 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy" Apr 20 19:12:19.581910 
ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581863 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy-web" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581870 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy-web" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581881 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="thanos-sidecar" Apr 20 19:12:19.581910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581887 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="thanos-sidecar" Apr 20 19:12:19.582315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581926 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="thanos-sidecar" Apr 20 19:12:19.582315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581935 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="prometheus" Apr 20 19:12:19.582315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581941 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy-web" Apr 20 19:12:19.582315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581949 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="config-reloader" Apr 20 19:12:19.582315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581959 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy" Apr 20 19:12:19.582315 
ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581966 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b299183a-8101-4599-a023-e4a120ba27f1" containerName="kube-rbac-proxy-thanos" Apr 20 19:12:19.582315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581978 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"} err="failed to get container status \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": rpc error: code = NotFound desc = could not find container \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": container with ID starting with ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855 not found: ID does not exist" Apr 20 19:12:19.582315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.581995 2572 scope.go:117] "RemoveContainer" containerID="1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462" Apr 20 19:12:19.582315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.582188 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"} err="failed to get container status \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": rpc error: code = NotFound desc = could not find container \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": container with ID starting with 1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462 not found: ID does not exist" Apr 20 19:12:19.582315 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.582204 2572 scope.go:117] "RemoveContainer" containerID="a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e" Apr 20 19:12:19.582689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.582443 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"} err="failed to get container status \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": rpc error: code = NotFound desc = could not find container \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": container with ID starting with a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e not found: ID does not exist" Apr 20 19:12:19.582689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.582457 2572 scope.go:117] "RemoveContainer" containerID="fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f" Apr 20 19:12:19.582689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.582641 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"} err="failed to get container status \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": rpc error: code = NotFound desc = could not find container \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": container with ID starting with fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f not found: ID does not exist" Apr 20 19:12:19.582689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.582654 2572 scope.go:117] "RemoveContainer" containerID="9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41" Apr 20 19:12:19.582851 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.582832 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"} err="failed to get container status \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": rpc error: code = NotFound desc = could not find container \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": container with ID starting with 
9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41 not found: ID does not exist" Apr 20 19:12:19.582890 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.582851 2572 scope.go:117] "RemoveContainer" containerID="37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444" Apr 20 19:12:19.583038 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583016 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"} err="failed to get container status \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": rpc error: code = NotFound desc = could not find container \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": container with ID starting with 37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444 not found: ID does not exist" Apr 20 19:12:19.583111 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583039 2572 scope.go:117] "RemoveContainer" containerID="55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9" Apr 20 19:12:19.583237 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583219 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"} err="failed to get container status \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": rpc error: code = NotFound desc = could not find container \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": container with ID starting with 55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9 not found: ID does not exist" Apr 20 19:12:19.583286 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583238 2572 scope.go:117] "RemoveContainer" containerID="ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855" Apr 20 19:12:19.583431 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583416 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"} err="failed to get container status \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": rpc error: code = NotFound desc = could not find container \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": container with ID starting with ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855 not found: ID does not exist"
Apr 20 19:12:19.583490 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583432 2572 scope.go:117] "RemoveContainer" containerID="1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"
Apr 20 19:12:19.583665 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583648 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"} err="failed to get container status \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": rpc error: code = NotFound desc = could not find container \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": container with ID starting with 1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462 not found: ID does not exist"
Apr 20 19:12:19.583729 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583667 2572 scope.go:117] "RemoveContainer" containerID="a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"
Apr 20 19:12:19.583913 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583891 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"} err="failed to get container status \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": rpc error: code = NotFound desc = could not find container \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": container with ID starting with a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e not found: ID does not exist"
Apr 20 19:12:19.583913 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.583912 2572 scope.go:117] "RemoveContainer" containerID="fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"
Apr 20 19:12:19.584130 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584113 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"} err="failed to get container status \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": rpc error: code = NotFound desc = could not find container \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": container with ID starting with fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f not found: ID does not exist"
Apr 20 19:12:19.584130 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584130 2572 scope.go:117] "RemoveContainer" containerID="9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"
Apr 20 19:12:19.584338 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584318 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"} err="failed to get container status \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": rpc error: code = NotFound desc = could not find container \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": container with ID starting with 9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41 not found: ID does not exist"
Apr 20 19:12:19.584389 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584338 2572 scope.go:117] "RemoveContainer" containerID="37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"
Apr 20 19:12:19.584567 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584549 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"} err="failed to get container status \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": rpc error: code = NotFound desc = could not find container \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": container with ID starting with 37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444 not found: ID does not exist"
Apr 20 19:12:19.584619 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584567 2572 scope.go:117] "RemoveContainer" containerID="55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"
Apr 20 19:12:19.584784 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584766 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"} err="failed to get container status \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": rpc error: code = NotFound desc = could not find container \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": container with ID starting with 55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9 not found: ID does not exist"
Apr 20 19:12:19.584879 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584785 2572 scope.go:117] "RemoveContainer" containerID="ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"
Apr 20 19:12:19.584972 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584957 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"} err="failed to get container status \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": rpc error: code = NotFound desc = could not find container \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": container with ID starting with ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855 not found: ID does not exist"
Apr 20 19:12:19.585010 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.584971 2572 scope.go:117] "RemoveContainer" containerID="1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"
Apr 20 19:12:19.585155 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.585140 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"} err="failed to get container status \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": rpc error: code = NotFound desc = could not find container \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": container with ID starting with 1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462 not found: ID does not exist"
Apr 20 19:12:19.585194 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.585155 2572 scope.go:117] "RemoveContainer" containerID="a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"
Apr 20 19:12:19.585358 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.585342 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"} err="failed to get container status \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": rpc error: code = NotFound desc = could not find container \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": container with ID starting with a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e not found: ID does not exist"
Apr 20 19:12:19.585358 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.585357 2572 scope.go:117] "RemoveContainer" containerID="fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"
Apr 20 19:12:19.585575 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.585560 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.585621 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.585579 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"} err="failed to get container status \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": rpc error: code = NotFound desc = could not find container \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": container with ID starting with fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f not found: ID does not exist"
Apr 20 19:12:19.585621 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.585596 2572 scope.go:117] "RemoveContainer" containerID="9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"
Apr 20 19:12:19.585950 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.585810 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"} err="failed to get container status \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": rpc error: code = NotFound desc = could not find container \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": container with ID starting with 9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41 not found: ID does not exist"
Apr 20 19:12:19.585950 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.585838 2572 scope.go:117] "RemoveContainer" containerID="37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"
Apr 20 19:12:19.586158 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.586132 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"} err="failed to get container status \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": rpc error: code = NotFound desc = could not find container \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": container with ID starting with 37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444 not found: ID does not exist"
Apr 20 19:12:19.586158 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.586158 2572 scope.go:117] "RemoveContainer" containerID="55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"
Apr 20 19:12:19.586413 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.586398 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"} err="failed to get container status \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": rpc error: code = NotFound desc = could not find container \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": container with ID starting with 55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9 not found: ID does not exist"
Apr 20 19:12:19.586485 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.586414 2572 scope.go:117] "RemoveContainer" containerID="ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"
Apr 20 19:12:19.586648 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.586621 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"} err="failed to get container status \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": rpc error: code = NotFound desc = could not find container \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": container with ID starting with ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855 not found: ID does not exist"
Apr 20 19:12:19.586697 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.586649 2572 scope.go:117] "RemoveContainer" containerID="1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"
Apr 20 19:12:19.586862 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.586845 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"} err="failed to get container status \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": rpc error: code = NotFound desc = could not find container \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": container with ID starting with 1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462 not found: ID does not exist"
Apr 20 19:12:19.586916 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.586863 2572 scope.go:117] "RemoveContainer" containerID="a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"
Apr 20 19:12:19.587093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587072 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"} err="failed to get container status \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": rpc error: code = NotFound desc = could not find container \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": container with ID starting with a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e not found: ID does not exist"
Apr 20 19:12:19.587093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587092 2572 scope.go:117] "RemoveContainer" containerID="fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"
Apr 20 19:12:19.587331 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587304 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"} err="failed to get container status \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": rpc error: code = NotFound desc = could not find container \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": container with ID starting with fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f not found: ID does not exist"
Apr 20 19:12:19.587407 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587332 2572 scope.go:117] "RemoveContainer" containerID="9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"
Apr 20 19:12:19.587566 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587542 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41"} err="failed to get container status \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": rpc error: code = NotFound desc = could not find container \"9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41\": container with ID starting with 9db6f71e93f0ccd1c8bd724619bea977922cbb6df7d10e1466bb5adcd8d7af41 not found: ID does not exist"
Apr 20 19:12:19.587566 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587562 2572 scope.go:117] "RemoveContainer" containerID="37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"
Apr 20 19:12:19.587792 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587775 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 19:12:19.587850 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587792 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444"} err="failed to get container status \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": rpc error: code = NotFound desc = could not find container \"37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444\": container with ID starting with 37533b3c7c04595b6a1bdddc85b15941bebc9d54614d7edc58eabf243a84c444 not found: ID does not exist"
Apr 20 19:12:19.587850 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587809 2572 scope.go:117] "RemoveContainer" containerID="55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"
Apr 20 19:12:19.587850 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.587784 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 19:12:19.588044 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588023 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9"} err="failed to get container status \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": rpc error: code = NotFound desc = could not find container \"55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9\": container with ID starting with 55746ce06ff36aebb6990c1a3cc789748cbf08ae75e7b98ae5b42bea851109c9 not found: ID does not exist"
Apr 20 19:12:19.588101 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588045 2572 scope.go:117] "RemoveContainer" containerID="ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"
Apr 20 19:12:19.588142 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588054 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 19:12:19.588265 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588246 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 19:12:19.588391 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588366 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855"} err="failed to get container status \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": rpc error: code = NotFound desc = could not find container \"ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855\": container with ID starting with ddfe3ae2abde67af1fbf162ec1f75f5163c9ffa34aedd3a5f944a9279302b855 not found: ID does not exist"
Apr 20 19:12:19.588442 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588396 2572 scope.go:117] "RemoveContainer" containerID="1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"
Apr 20 19:12:19.588442 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588401 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-dp0knp9piq4hn\""
Apr 20 19:12:19.588649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588466 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 19:12:19.588649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588466 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 19:12:19.588649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588597 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-bdz9h\""
Apr 20 19:12:19.588649 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588625 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 19:12:19.588843 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588648 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 19:12:19.588843 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588745 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462"} err="failed to get container status \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": rpc error: code = NotFound desc = could not find container \"1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462\": container with ID starting with 1668bc9f56368ea1ca3aef19b19cdd8943fa26ae5ddecbd43bedd0213c654462 not found: ID does not exist"
Apr 20 19:12:19.588843 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588767 2572 scope.go:117] "RemoveContainer" containerID="a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"
Apr 20 19:12:19.588978 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588848 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 19:12:19.588978 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.588919 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 19:12:19.589050 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.589012 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e"} err="failed to get container status \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": rpc error: code = NotFound desc = could not find container \"a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e\": container with ID starting with a29ce0787668d810abcdc40e1b2c71529060106450741f71447d02ab46c4578e not found: ID does not exist"
Apr 20 19:12:19.589050 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.589029 2572 scope.go:117] "RemoveContainer" containerID="fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"
Apr 20 19:12:19.589141 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.589058 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 19:12:19.589270 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.589252 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f"} err="failed to get container status \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": rpc error: code = NotFound desc = could not find container \"fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f\": container with ID starting with fb1cee5e909e8e10fdcbc14136981e031fa738c123db6e15c92b1b66a792d33f not found: ID does not exist"
Apr 20 19:12:19.592151 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.592132 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 19:12:19.593358 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.593341 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 19:12:19.606199 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.606171 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 19:12:19.667509 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667454 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.667509 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.667727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667538 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-config\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.667727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.667727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667587 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-web-config\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.667727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667633 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.667727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667664 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2da19f46-34a6-4ebb-863a-83d31b4ab964-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.667727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667703 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.667727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.668062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667746 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.668062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhm2p\" (UniqueName: \"kubernetes.io/projected/2da19f46-34a6-4ebb-863a-83d31b4ab964-kube-api-access-lhm2p\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.668062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.668062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2da19f46-34a6-4ebb-863a-83d31b4ab964-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.668062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2da19f46-34a6-4ebb-863a-83d31b4ab964-config-out\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.668062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.668062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.667997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.668062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.668024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.668313 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.668066 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768668 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768668 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-config\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-web-config\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2da19f46-34a6-4ebb-863a-83d31b4ab964-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768800 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.768908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.768852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhm2p\" (UniqueName: \"kubernetes.io/projected/2da19f46-34a6-4ebb-863a-83d31b4ab964-kube-api-access-lhm2p\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.769899 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.769870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.770004 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.769882 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.770069 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.770018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.770128 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.770086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2da19f46-34a6-4ebb-863a-83d31b4ab964-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.770180 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.770137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2da19f46-34a6-4ebb-863a-83d31b4ab964-config-out\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.770247 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.770230 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.770333 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.770312 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.770488 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.770355 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.770572 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.770534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.770621 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.770577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 19:12:19.772314 ip-10-0-136-5
kubenswrapper[2572]: I0420 19:12:19.772075 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.772314 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.772098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2da19f46-34a6-4ebb-863a-83d31b4ab964-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.772976 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.772622 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-config\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.772976 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.772887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.773683 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.773656 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.776988 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:12:19.775282 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.776988 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.775399 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.776988 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.776198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-web-config\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.776988 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.776581 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.776988 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.776865 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 
19:12:19.777309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.777131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2da19f46-34a6-4ebb-863a-83d31b4ab964-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.778136 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.777676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2da19f46-34a6-4ebb-863a-83d31b4ab964-config-out\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.778136 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.777911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhm2p\" (UniqueName: \"kubernetes.io/projected/2da19f46-34a6-4ebb-863a-83d31b4ab964-kube-api-access-lhm2p\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.778970 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.778868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2da19f46-34a6-4ebb-863a-83d31b4ab964-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.779662 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.779643 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.779772 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.779754 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2da19f46-34a6-4ebb-863a-83d31b4ab964-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2da19f46-34a6-4ebb-863a-83d31b4ab964\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:19.828429 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.828400 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b299183a-8101-4599-a023-e4a120ba27f1" path="/var/lib/kubelet/pods/b299183a-8101-4599-a023-e4a120ba27f1/volumes" Apr 20 19:12:19.895398 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:19.895370 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:20.012686 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:20.012658 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 19:12:20.015862 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:12:20.015838 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2da19f46_34a6_4ebb_863a_83d31b4ab964.slice/crio-7c5c5b3d397f1cdd1132d6621a68c16aaacb8da6dbbd942c78ce3892bdb507df WatchSource:0}: Error finding container 7c5c5b3d397f1cdd1132d6621a68c16aaacb8da6dbbd942c78ce3892bdb507df: Status 404 returned error can't find the container with id 7c5c5b3d397f1cdd1132d6621a68c16aaacb8da6dbbd942c78ce3892bdb507df Apr 20 19:12:20.531920 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:20.531880 2572 generic.go:358] "Generic (PLEG): container finished" podID="2da19f46-34a6-4ebb-863a-83d31b4ab964" containerID="751231b2535cbd348c00d9d72ad107f813202937790f71eb1a007fc153339fc5" exitCode=0 Apr 20 19:12:20.532339 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:20.531955 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da19f46-34a6-4ebb-863a-83d31b4ab964","Type":"ContainerDied","Data":"751231b2535cbd348c00d9d72ad107f813202937790f71eb1a007fc153339fc5"} Apr 20 19:12:20.532339 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:20.531986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da19f46-34a6-4ebb-863a-83d31b4ab964","Type":"ContainerStarted","Data":"7c5c5b3d397f1cdd1132d6621a68c16aaacb8da6dbbd942c78ce3892bdb507df"} Apr 20 19:12:21.536860 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:21.536781 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da19f46-34a6-4ebb-863a-83d31b4ab964","Type":"ContainerStarted","Data":"eda23976bae0fc8bcebbcb89c7feb1227151b090578d6ed97abe16c78db07636"} Apr 20 19:12:21.536860 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:21.536814 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da19f46-34a6-4ebb-863a-83d31b4ab964","Type":"ContainerStarted","Data":"6628c065a0f0d90767317c1808cc0510aa4c47929148969287fd416c9f231f06"} Apr 20 19:12:21.536860 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:21.536825 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da19f46-34a6-4ebb-863a-83d31b4ab964","Type":"ContainerStarted","Data":"08b62283bbaefbf0b6edbb44af5497ca251770193e33bb60da7a903fe121fee7"} Apr 20 19:12:21.536860 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:21.536833 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da19f46-34a6-4ebb-863a-83d31b4ab964","Type":"ContainerStarted","Data":"0119a7ee941ce13446565535318f3709856240037a62144242ea85648b4959c5"} Apr 20 19:12:21.536860 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:21.536844 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da19f46-34a6-4ebb-863a-83d31b4ab964","Type":"ContainerStarted","Data":"73b6bb36c260650dff04c272e7f293795a4089b7852e020eb9d9e5b071a9352e"} Apr 20 19:12:21.536860 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:21.536855 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da19f46-34a6-4ebb-863a-83d31b4ab964","Type":"ContainerStarted","Data":"8bcbd14f0158f8062dd47a505ddf45a1b016f3cef81de1489a90f772f41d077a"} Apr 20 19:12:21.562733 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:21.562686 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.562673689 podStartE2EDuration="2.562673689s" podCreationTimestamp="2026-04-20 19:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:12:21.561413155 +0000 UTC m=+250.280715927" watchObservedRunningTime="2026-04-20 19:12:21.562673689 +0000 UTC m=+250.281976461" Apr 20 19:12:23.606823 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:23.606778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:12:23.609146 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:23.609111 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9c238c6-ab0d-4140-b842-f59e7642479c-metrics-certs\") pod \"network-metrics-daemon-rwnnv\" (UID: \"e9c238c6-ab0d-4140-b842-f59e7642479c\") " pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:12:23.628042 
ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:23.628023 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bdmvj\"" Apr 20 19:12:23.636686 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:23.636671 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rwnnv" Apr 20 19:12:23.771783 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:23.771693 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rwnnv"] Apr 20 19:12:23.774292 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:12:23.774262 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9c238c6_ab0d_4140_b842_f59e7642479c.slice/crio-989b290be9773bf58aa45bb75d8bf3a7877639241ef79735890e573118bd60d2 WatchSource:0}: Error finding container 989b290be9773bf58aa45bb75d8bf3a7877639241ef79735890e573118bd60d2: Status 404 returned error can't find the container with id 989b290be9773bf58aa45bb75d8bf3a7877639241ef79735890e573118bd60d2 Apr 20 19:12:24.549155 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:24.549119 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rwnnv" event={"ID":"e9c238c6-ab0d-4140-b842-f59e7642479c","Type":"ContainerStarted","Data":"989b290be9773bf58aa45bb75d8bf3a7877639241ef79735890e573118bd60d2"} Apr 20 19:12:24.896106 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:24.896027 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:12:25.553542 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:25.553502 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rwnnv" 
event={"ID":"e9c238c6-ab0d-4140-b842-f59e7642479c","Type":"ContainerStarted","Data":"de27b9c994483e843612352e572dbe942e6463114729b4db8e652151ece5b085"} Apr 20 19:12:25.553542 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:25.553543 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rwnnv" event={"ID":"e9c238c6-ab0d-4140-b842-f59e7642479c","Type":"ContainerStarted","Data":"abab7af2c83f1979dcec6afaa0cb694100010a7049a4c15e7e77fe42a99f561e"} Apr 20 19:12:25.570135 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:12:25.570093 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rwnnv" podStartSLOduration=253.432178898 podStartE2EDuration="4m14.570078877s" podCreationTimestamp="2026-04-20 19:08:11 +0000 UTC" firstStartedPulling="2026-04-20 19:12:23.776261176 +0000 UTC m=+252.495563933" lastFinishedPulling="2026-04-20 19:12:24.914161161 +0000 UTC m=+253.633463912" observedRunningTime="2026-04-20 19:12:25.568245829 +0000 UTC m=+254.287548601" watchObservedRunningTime="2026-04-20 19:12:25.570078877 +0000 UTC m=+254.289381649" Apr 20 19:13:19.896467 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:13:19.896434 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:13:19.912067 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:13:19.912042 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:13:20.712341 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:13:20.712317 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 19:15:59.961833 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:15:59.961801 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj"] Apr 20 19:15:59.964795 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:15:59.964774 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:15:59.968056 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:15:59.968036 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-kdxcc\"" Apr 20 19:15:59.968321 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:15:59.968300 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 19:15:59.968408 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:15:59.968391 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 19:15:59.968526 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:15:59.968512 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 19:15:59.971259 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:15:59.971242 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 19:15:59.995197 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:15:59.995163 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj"] Apr 20 19:16:00.133887 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.133857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z74d\" (UniqueName: \"kubernetes.io/projected/26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba-kube-api-access-7z74d\") pod \"opendatahub-operator-controller-manager-6c77764cd6-d25gj\" (UID: \"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" 
Apr 20 19:16:00.134063 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.133895 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-d25gj\" (UID: \"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:00.134063 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.133981 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-d25gj\" (UID: \"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:00.235373 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.235285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-d25gj\" (UID: \"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:00.235551 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.235444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z74d\" (UniqueName: \"kubernetes.io/projected/26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba-kube-api-access-7z74d\") pod \"opendatahub-operator-controller-manager-6c77764cd6-d25gj\" (UID: \"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:00.235551 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.235508 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-d25gj\" (UID: \"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:00.237875 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.237851 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-d25gj\" (UID: \"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:00.237991 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.237971 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-d25gj\" (UID: \"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:00.245766 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.245742 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z74d\" (UniqueName: \"kubernetes.io/projected/26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba-kube-api-access-7z74d\") pod \"opendatahub-operator-controller-manager-6c77764cd6-d25gj\" (UID: \"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:00.274316 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.274287 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:00.394794 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.394772 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj"] Apr 20 19:16:00.397442 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:16:00.397418 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e1776a_b8a0_4fa8_aa99_2e7d50f0e3ba.slice/crio-c302d4254f85cf3ba1f09d387f5ab0e0081a1436437ed8cf9f99e5104412c4ca WatchSource:0}: Error finding container c302d4254f85cf3ba1f09d387f5ab0e0081a1436437ed8cf9f99e5104412c4ca: Status 404 returned error can't find the container with id c302d4254f85cf3ba1f09d387f5ab0e0081a1436437ed8cf9f99e5104412c4ca Apr 20 19:16:00.399023 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:00.399003 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:16:01.096069 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:01.096036 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" event={"ID":"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba","Type":"ContainerStarted","Data":"c302d4254f85cf3ba1f09d387f5ab0e0081a1436437ed8cf9f99e5104412c4ca"} Apr 20 19:16:03.102533 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:03.102439 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" event={"ID":"26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba","Type":"ContainerStarted","Data":"8845888ccf2ecad26c58be8e0e37d2eaee5158f013efd33fb42b0c7df824cc81"} Apr 20 19:16:03.102889 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:03.102618 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:03.125115 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:03.125054 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" podStartSLOduration=1.7420298920000001 podStartE2EDuration="4.125036044s" podCreationTimestamp="2026-04-20 19:15:59 +0000 UTC" firstStartedPulling="2026-04-20 19:16:00.399173015 +0000 UTC m=+469.118475766" lastFinishedPulling="2026-04-20 19:16:02.782179168 +0000 UTC m=+471.501481918" observedRunningTime="2026-04-20 19:16:03.124092245 +0000 UTC m=+471.843395019" watchObservedRunningTime="2026-04-20 19:16:03.125036044 +0000 UTC m=+471.844338816" Apr 20 19:16:14.108213 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:14.108177 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-d25gj" Apr 20 19:16:28.437753 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.437712 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5489467c57-rkknm"] Apr 20 19:16:28.440809 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.440787 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm" Apr 20 19:16:28.443308 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.443290 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 19:16:28.444381 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.444354 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-hdxhc\"" Apr 20 19:16:28.444381 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.444374 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 19:16:28.444381 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.444361 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 19:16:28.444597 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.444367 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 19:16:28.451703 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.451683 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5489467c57-rkknm"] Apr 20 19:16:28.564774 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.564739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8208c3ef-bdb3-4df4-81b8-808e9113792f-tls-certs\") pod \"kube-auth-proxy-5489467c57-rkknm\" (UID: \"8208c3ef-bdb3-4df4-81b8-808e9113792f\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm" Apr 20 19:16:28.564774 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.564777 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98pnz\" (UniqueName: 
\"kubernetes.io/projected/8208c3ef-bdb3-4df4-81b8-808e9113792f-kube-api-access-98pnz\") pod \"kube-auth-proxy-5489467c57-rkknm\" (UID: \"8208c3ef-bdb3-4df4-81b8-808e9113792f\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm"
Apr 20 19:16:28.564993 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.564809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8208c3ef-bdb3-4df4-81b8-808e9113792f-tmp\") pod \"kube-auth-proxy-5489467c57-rkknm\" (UID: \"8208c3ef-bdb3-4df4-81b8-808e9113792f\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm"
Apr 20 19:16:28.666158 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.666127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8208c3ef-bdb3-4df4-81b8-808e9113792f-tls-certs\") pod \"kube-auth-proxy-5489467c57-rkknm\" (UID: \"8208c3ef-bdb3-4df4-81b8-808e9113792f\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm"
Apr 20 19:16:28.666158 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.666163 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98pnz\" (UniqueName: \"kubernetes.io/projected/8208c3ef-bdb3-4df4-81b8-808e9113792f-kube-api-access-98pnz\") pod \"kube-auth-proxy-5489467c57-rkknm\" (UID: \"8208c3ef-bdb3-4df4-81b8-808e9113792f\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm"
Apr 20 19:16:28.666408 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.666206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8208c3ef-bdb3-4df4-81b8-808e9113792f-tmp\") pod \"kube-auth-proxy-5489467c57-rkknm\" (UID: \"8208c3ef-bdb3-4df4-81b8-808e9113792f\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm"
Apr 20 19:16:28.668512 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.668465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8208c3ef-bdb3-4df4-81b8-808e9113792f-tmp\") pod \"kube-auth-proxy-5489467c57-rkknm\" (UID: \"8208c3ef-bdb3-4df4-81b8-808e9113792f\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm"
Apr 20 19:16:28.668614 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.668595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8208c3ef-bdb3-4df4-81b8-808e9113792f-tls-certs\") pod \"kube-auth-proxy-5489467c57-rkknm\" (UID: \"8208c3ef-bdb3-4df4-81b8-808e9113792f\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm"
Apr 20 19:16:28.674162 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.674140 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98pnz\" (UniqueName: \"kubernetes.io/projected/8208c3ef-bdb3-4df4-81b8-808e9113792f-kube-api-access-98pnz\") pod \"kube-auth-proxy-5489467c57-rkknm\" (UID: \"8208c3ef-bdb3-4df4-81b8-808e9113792f\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm"
Apr 20 19:16:28.749634 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.749553 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm"
Apr 20 19:16:28.881504 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:28.881405 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5489467c57-rkknm"]
Apr 20 19:16:28.884726 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:16:28.884693 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8208c3ef_bdb3_4df4_81b8_808e9113792f.slice/crio-68acb4ba0b538c7ac3c4a7de359178ab3fab9077b51d21f7213895deb95d04fe WatchSource:0}: Error finding container 68acb4ba0b538c7ac3c4a7de359178ab3fab9077b51d21f7213895deb95d04fe: Status 404 returned error can't find the container with id 68acb4ba0b538c7ac3c4a7de359178ab3fab9077b51d21f7213895deb95d04fe
Apr 20 19:16:29.174830 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:29.174786 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm" event={"ID":"8208c3ef-bdb3-4df4-81b8-808e9113792f","Type":"ContainerStarted","Data":"68acb4ba0b538c7ac3c4a7de359178ab3fab9077b51d21f7213895deb95d04fe"}
Apr 20 19:16:31.203392 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.203358 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-hkrrj"]
Apr 20 19:16:31.206948 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.206925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:31.209857 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.209824 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 20 19:16:31.210013 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.209994 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-88vbq\""
Apr 20 19:16:31.217680 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.217654 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-hkrrj"]
Apr 20 19:16:31.287974 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.287942 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/549ea97d-23c0-456d-b087-b04bb3694d05-cert\") pod \"odh-model-controller-858dbf95b8-hkrrj\" (UID: \"549ea97d-23c0-456d-b087-b04bb3694d05\") " pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:31.288144 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.287983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8c9\" (UniqueName: \"kubernetes.io/projected/549ea97d-23c0-456d-b087-b04bb3694d05-kube-api-access-zq8c9\") pod \"odh-model-controller-858dbf95b8-hkrrj\" (UID: \"549ea97d-23c0-456d-b087-b04bb3694d05\") " pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:31.388889 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.388856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/549ea97d-23c0-456d-b087-b04bb3694d05-cert\") pod \"odh-model-controller-858dbf95b8-hkrrj\" (UID: \"549ea97d-23c0-456d-b087-b04bb3694d05\") " pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:31.389055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.388921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8c9\" (UniqueName: \"kubernetes.io/projected/549ea97d-23c0-456d-b087-b04bb3694d05-kube-api-access-zq8c9\") pod \"odh-model-controller-858dbf95b8-hkrrj\" (UID: \"549ea97d-23c0-456d-b087-b04bb3694d05\") " pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:31.389055 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:16:31.389007 2572 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 20 19:16:31.389128 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:16:31.389071 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/549ea97d-23c0-456d-b087-b04bb3694d05-cert podName:549ea97d-23c0-456d-b087-b04bb3694d05 nodeName:}" failed. No retries permitted until 2026-04-20 19:16:31.889053006 +0000 UTC m=+500.608355756 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/549ea97d-23c0-456d-b087-b04bb3694d05-cert") pod "odh-model-controller-858dbf95b8-hkrrj" (UID: "549ea97d-23c0-456d-b087-b04bb3694d05") : secret "odh-model-controller-webhook-cert" not found
Apr 20 19:16:31.398310 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.398273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8c9\" (UniqueName: \"kubernetes.io/projected/549ea97d-23c0-456d-b087-b04bb3694d05-kube-api-access-zq8c9\") pod \"odh-model-controller-858dbf95b8-hkrrj\" (UID: \"549ea97d-23c0-456d-b087-b04bb3694d05\") " pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:31.894068 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.894030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/549ea97d-23c0-456d-b087-b04bb3694d05-cert\") pod \"odh-model-controller-858dbf95b8-hkrrj\" (UID: \"549ea97d-23c0-456d-b087-b04bb3694d05\") " pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:31.896931 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:31.896904 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/549ea97d-23c0-456d-b087-b04bb3694d05-cert\") pod \"odh-model-controller-858dbf95b8-hkrrj\" (UID: \"549ea97d-23c0-456d-b087-b04bb3694d05\") " pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:32.121076 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:32.121034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:32.398503 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:32.398460 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-hkrrj"]
Apr 20 19:16:32.400776 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:16:32.400741 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549ea97d_23c0_456d_b087_b04bb3694d05.slice/crio-b0c1c3a95908e608ad58f6cfee457709a14b2b6b306e63a8ad48737ba523292b WatchSource:0}: Error finding container b0c1c3a95908e608ad58f6cfee457709a14b2b6b306e63a8ad48737ba523292b: Status 404 returned error can't find the container with id b0c1c3a95908e608ad58f6cfee457709a14b2b6b306e63a8ad48737ba523292b
Apr 20 19:16:33.188360 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:33.188141 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm" event={"ID":"8208c3ef-bdb3-4df4-81b8-808e9113792f","Type":"ContainerStarted","Data":"aedc0a407aa0853267513577c09b80eced1631ecf24200b5ff36cec52d401bab"}
Apr 20 19:16:33.189663 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:33.189629 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj" event={"ID":"549ea97d-23c0-456d-b087-b04bb3694d05","Type":"ContainerStarted","Data":"b0c1c3a95908e608ad58f6cfee457709a14b2b6b306e63a8ad48737ba523292b"}
Apr 20 19:16:33.209964 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:33.209904 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5489467c57-rkknm" podStartSLOduration=1.763259181 podStartE2EDuration="5.209886043s" podCreationTimestamp="2026-04-20 19:16:28 +0000 UTC" firstStartedPulling="2026-04-20 19:16:28.886622907 +0000 UTC m=+497.605925663" lastFinishedPulling="2026-04-20 19:16:32.33324976 +0000 UTC m=+501.052552525" observedRunningTime="2026-04-20 19:16:33.207462219 +0000 UTC m=+501.926764993" watchObservedRunningTime="2026-04-20 19:16:33.209886043 +0000 UTC m=+501.929188859"
Apr 20 19:16:36.202542 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:36.202507 2572 generic.go:358] "Generic (PLEG): container finished" podID="549ea97d-23c0-456d-b087-b04bb3694d05" containerID="a6ab0798168d25fa569026a93e015f9df8c4c19e9d1cef7f32c4f522f485b1a8" exitCode=1
Apr 20 19:16:36.203061 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:36.202570 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj" event={"ID":"549ea97d-23c0-456d-b087-b04bb3694d05","Type":"ContainerDied","Data":"a6ab0798168d25fa569026a93e015f9df8c4c19e9d1cef7f32c4f522f485b1a8"}
Apr 20 19:16:36.203061 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:36.202806 2572 scope.go:117] "RemoveContainer" containerID="a6ab0798168d25fa569026a93e015f9df8c4c19e9d1cef7f32c4f522f485b1a8"
Apr 20 19:16:37.207251 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.207212 2572 generic.go:358] "Generic (PLEG): container finished" podID="549ea97d-23c0-456d-b087-b04bb3694d05" containerID="561caf877141e5d3116cf7eb3defa50c2a02ef80c7513818d3c8ad33d72f32ce" exitCode=1
Apr 20 19:16:37.207729 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.207290 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj" event={"ID":"549ea97d-23c0-456d-b087-b04bb3694d05","Type":"ContainerDied","Data":"561caf877141e5d3116cf7eb3defa50c2a02ef80c7513818d3c8ad33d72f32ce"}
Apr 20 19:16:37.207729 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.207326 2572 scope.go:117] "RemoveContainer" containerID="a6ab0798168d25fa569026a93e015f9df8c4c19e9d1cef7f32c4f522f485b1a8"
Apr 20 19:16:37.207729 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.207612 2572 scope.go:117] "RemoveContainer" containerID="561caf877141e5d3116cf7eb3defa50c2a02ef80c7513818d3c8ad33d72f32ce"
Apr 20 19:16:37.207874 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:16:37.207850 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-hkrrj_opendatahub(549ea97d-23c0-456d-b087-b04bb3694d05)\"" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj" podUID="549ea97d-23c0-456d-b087-b04bb3694d05"
Apr 20 19:16:37.693990 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.693930 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-njx8s"]
Apr 20 19:16:37.696387 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.696371 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:37.698961 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.698936 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 20 19:16:37.699185 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.699163 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-wdfbs\""
Apr 20 19:16:37.710343 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.710314 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-njx8s"]
Apr 20 19:16:37.846628 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.846600 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e6fec0a-e407-4959-8eac-96add8aa5367-cert\") pod \"kserve-controller-manager-856948b99f-njx8s\" (UID: \"5e6fec0a-e407-4959-8eac-96add8aa5367\") " pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:37.846628 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.846632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2svg2\" (UniqueName: \"kubernetes.io/projected/5e6fec0a-e407-4959-8eac-96add8aa5367-kube-api-access-2svg2\") pod \"kserve-controller-manager-856948b99f-njx8s\" (UID: \"5e6fec0a-e407-4959-8eac-96add8aa5367\") " pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:37.947553 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.947451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e6fec0a-e407-4959-8eac-96add8aa5367-cert\") pod \"kserve-controller-manager-856948b99f-njx8s\" (UID: \"5e6fec0a-e407-4959-8eac-96add8aa5367\") " pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:37.947553 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.947504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2svg2\" (UniqueName: \"kubernetes.io/projected/5e6fec0a-e407-4959-8eac-96add8aa5367-kube-api-access-2svg2\") pod \"kserve-controller-manager-856948b99f-njx8s\" (UID: \"5e6fec0a-e407-4959-8eac-96add8aa5367\") " pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:37.947770 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:16:37.947602 2572 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 20 19:16:37.947770 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:16:37.947678 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e6fec0a-e407-4959-8eac-96add8aa5367-cert podName:5e6fec0a-e407-4959-8eac-96add8aa5367 nodeName:}" failed. No retries permitted until 2026-04-20 19:16:38.447660378 +0000 UTC m=+507.166963128 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5e6fec0a-e407-4959-8eac-96add8aa5367-cert") pod "kserve-controller-manager-856948b99f-njx8s" (UID: "5e6fec0a-e407-4959-8eac-96add8aa5367") : secret "kserve-webhook-server-cert" not found
Apr 20 19:16:37.958601 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:37.958582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2svg2\" (UniqueName: \"kubernetes.io/projected/5e6fec0a-e407-4959-8eac-96add8aa5367-kube-api-access-2svg2\") pod \"kserve-controller-manager-856948b99f-njx8s\" (UID: \"5e6fec0a-e407-4959-8eac-96add8aa5367\") " pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:38.216444 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:38.216370 2572 scope.go:117] "RemoveContainer" containerID="561caf877141e5d3116cf7eb3defa50c2a02ef80c7513818d3c8ad33d72f32ce"
Apr 20 19:16:38.216830 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:16:38.216569 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-hkrrj_opendatahub(549ea97d-23c0-456d-b087-b04bb3694d05)\"" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj" podUID="549ea97d-23c0-456d-b087-b04bb3694d05"
Apr 20 19:16:38.452759 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:38.452723 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e6fec0a-e407-4959-8eac-96add8aa5367-cert\") pod \"kserve-controller-manager-856948b99f-njx8s\" (UID: \"5e6fec0a-e407-4959-8eac-96add8aa5367\") " pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:38.455085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:38.455063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e6fec0a-e407-4959-8eac-96add8aa5367-cert\") pod \"kserve-controller-manager-856948b99f-njx8s\" (UID: \"5e6fec0a-e407-4959-8eac-96add8aa5367\") " pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:38.608515 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:38.608395 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:38.730846 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:38.730720 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-njx8s"]
Apr 20 19:16:38.732930 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:16:38.732899 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e6fec0a_e407_4959_8eac_96add8aa5367.slice/crio-672532b79db6464573f2e728a6a1a44e2cf3ff85bc32f87557b557f287bd17ed WatchSource:0}: Error finding container 672532b79db6464573f2e728a6a1a44e2cf3ff85bc32f87557b557f287bd17ed: Status 404 returned error can't find the container with id 672532b79db6464573f2e728a6a1a44e2cf3ff85bc32f87557b557f287bd17ed
Apr 20 19:16:39.219951 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:39.219919 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-njx8s" event={"ID":"5e6fec0a-e407-4959-8eac-96add8aa5367","Type":"ContainerStarted","Data":"672532b79db6464573f2e728a6a1a44e2cf3ff85bc32f87557b557f287bd17ed"}
Apr 20 19:16:42.121576 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:42.121540 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:42.121987 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:42.121909 2572 scope.go:117] "RemoveContainer" containerID="561caf877141e5d3116cf7eb3defa50c2a02ef80c7513818d3c8ad33d72f32ce"
Apr 20 19:16:42.122090 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:16:42.122071 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-hkrrj_opendatahub(549ea97d-23c0-456d-b087-b04bb3694d05)\"" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj" podUID="549ea97d-23c0-456d-b087-b04bb3694d05"
Apr 20 19:16:42.231114 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:42.231072 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-njx8s" event={"ID":"5e6fec0a-e407-4959-8eac-96add8aa5367","Type":"ContainerStarted","Data":"828429030e07ef073c6f8e05726d006c3690e430fbc2cf669a40cdf2d26f2f89"}
Apr 20 19:16:42.231287 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:42.231194 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-njx8s"
Apr 20 19:16:42.263096 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:42.263046 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-njx8s" podStartSLOduration=2.474276166 podStartE2EDuration="5.263028136s" podCreationTimestamp="2026-04-20 19:16:37 +0000 UTC" firstStartedPulling="2026-04-20 19:16:38.734132328 +0000 UTC m=+507.453435083" lastFinishedPulling="2026-04-20 19:16:41.522884287 +0000 UTC m=+510.242187053" observedRunningTime="2026-04-20 19:16:42.26173295 +0000 UTC m=+510.981035723" watchObservedRunningTime="2026-04-20 19:16:42.263028136 +0000 UTC m=+510.982330907"
Apr 20 19:16:43.282283 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.282251 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"]
Apr 20 19:16:43.284956 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.284923 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:43.291724 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.291696 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-6v6f5\""
Apr 20 19:16:43.291871 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.291745 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 20 19:16:43.291871 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.291764 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 20 19:16:43.312700 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.312671 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"]
Apr 20 19:16:43.391460 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.391431 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/e6cb731a-d95e-4bf9-a0a0-a1c635310fa3-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9xkh8\" (UID: \"e6cb731a-d95e-4bf9-a0a0-a1c635310fa3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:43.391615 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.391465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62zd7\" (UniqueName: \"kubernetes.io/projected/e6cb731a-d95e-4bf9-a0a0-a1c635310fa3-kube-api-access-62zd7\") pod \"servicemesh-operator3-55f49c5f94-9xkh8\" (UID: \"e6cb731a-d95e-4bf9-a0a0-a1c635310fa3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:43.492932 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.492899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/e6cb731a-d95e-4bf9-a0a0-a1c635310fa3-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9xkh8\" (UID: \"e6cb731a-d95e-4bf9-a0a0-a1c635310fa3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:43.492932 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.492934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62zd7\" (UniqueName: \"kubernetes.io/projected/e6cb731a-d95e-4bf9-a0a0-a1c635310fa3-kube-api-access-62zd7\") pod \"servicemesh-operator3-55f49c5f94-9xkh8\" (UID: \"e6cb731a-d95e-4bf9-a0a0-a1c635310fa3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:43.495538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.495509 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/e6cb731a-d95e-4bf9-a0a0-a1c635310fa3-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9xkh8\" (UID: \"e6cb731a-d95e-4bf9-a0a0-a1c635310fa3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:43.503050 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.503024 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62zd7\" (UniqueName: \"kubernetes.io/projected/e6cb731a-d95e-4bf9-a0a0-a1c635310fa3-kube-api-access-62zd7\") pod \"servicemesh-operator3-55f49c5f94-9xkh8\" (UID: \"e6cb731a-d95e-4bf9-a0a0-a1c635310fa3\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:43.596811 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.596713 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:43.725027 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:43.724961 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"]
Apr 20 19:16:43.728760 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:16:43.728734 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6cb731a_d95e_4bf9_a0a0_a1c635310fa3.slice/crio-f4a166f7e15623a1e79b84dcaa7f519e72b0c80f184880812fd32730021efde3 WatchSource:0}: Error finding container f4a166f7e15623a1e79b84dcaa7f519e72b0c80f184880812fd32730021efde3: Status 404 returned error can't find the container with id f4a166f7e15623a1e79b84dcaa7f519e72b0c80f184880812fd32730021efde3
Apr 20 19:16:44.237486 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:44.237431 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8" event={"ID":"e6cb731a-d95e-4bf9-a0a0-a1c635310fa3","Type":"ContainerStarted","Data":"f4a166f7e15623a1e79b84dcaa7f519e72b0c80f184880812fd32730021efde3"}
Apr 20 19:16:47.248595 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:47.248560 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8" event={"ID":"e6cb731a-d95e-4bf9-a0a0-a1c635310fa3","Type":"ContainerStarted","Data":"2aadeb02cc12dc44456d9ad778532ab58b436e844a45f874e44dc155c39e3857"}
Apr 20 19:16:47.248595 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:47.248612 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:47.271599 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:47.271556 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8" podStartSLOduration=1.319595859 podStartE2EDuration="4.271541481s" podCreationTimestamp="2026-04-20 19:16:43 +0000 UTC" firstStartedPulling="2026-04-20 19:16:43.731638367 +0000 UTC m=+512.450941116" lastFinishedPulling="2026-04-20 19:16:46.683583969 +0000 UTC m=+515.402886738" observedRunningTime="2026-04-20 19:16:47.269212549 +0000 UTC m=+515.988515321" watchObservedRunningTime="2026-04-20 19:16:47.271541481 +0000 UTC m=+515.990844255"
Apr 20 19:16:52.121558 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:52.121457 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:52.121922 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:52.121851 2572 scope.go:117] "RemoveContainer" containerID="561caf877141e5d3116cf7eb3defa50c2a02ef80c7513818d3c8ad33d72f32ce"
Apr 20 19:16:53.268635 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:53.268595 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj" event={"ID":"549ea97d-23c0-456d-b087-b04bb3694d05","Type":"ContainerStarted","Data":"ca6d76f422f7a8d4eba530460847a04701a4f456d41ca43b78751d0eb29dc415"}
Apr 20 19:16:53.269052 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:53.268813 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj"
Apr 20 19:16:53.287281 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:53.287232 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj" podStartSLOduration=2.257136368 podStartE2EDuration="22.287220202s" podCreationTimestamp="2026-04-20 19:16:31 +0000 UTC" firstStartedPulling="2026-04-20 19:16:32.402091924 +0000 UTC m=+501.121394678" lastFinishedPulling="2026-04-20 19:16:52.432175759 +0000 UTC m=+521.151478512" observedRunningTime="2026-04-20 19:16:53.285999563 +0000 UTC m=+522.005302358" watchObservedRunningTime="2026-04-20 19:16:53.287220202 +0000 UTC m=+522.006522974"
Apr 20 19:16:58.254499 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:58.254448 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9xkh8"
Apr 20 19:16:59.248876 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.248838 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"]
Apr 20 19:16:59.251383 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.251363 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.254105 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.254085 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 20 19:16:59.254322 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.254301 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 20 19:16:59.254792 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.254754 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 20 19:16:59.254792 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.254766 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-wpnd5\""
Apr 20 19:16:59.255172 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.254755 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 20 19:16:59.268601 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.268578 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"]
Apr 20 19:16:59.329131 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.329107 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.329285 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.329147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.329285 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.329169 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.329285 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.329252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.329393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.329328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgn6m\" (UniqueName: \"kubernetes.io/projected/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-kube-api-access-vgn6m\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.329393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.329365 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.329453 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.329394 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.429821 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.429780 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.430029 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.429835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.430029 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.429867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.430029 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.429903 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgn6m\" (UniqueName: \"kubernetes.io/projected/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-kube-api-access-vgn6m\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.430190 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.430079 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.430190 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.430128 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.430190 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.430173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.430855 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.430830 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.432535 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.432491 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.432830 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.432805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"
Apr 20 19:16:59.432933 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.432828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") "
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" Apr 20 19:16:59.432933 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.432839 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" Apr 20 19:16:59.442525 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.442505 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" Apr 20 19:16:59.447439 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.447408 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgn6m\" (UniqueName: \"kubernetes.io/projected/7f2c3012-6e24-4bb5-b0e9-72fb8186d007-kube-api-access-vgn6m\") pod \"istiod-openshift-gateway-55ff986f96-8ptvr\" (UID: \"7f2c3012-6e24-4bb5-b0e9-72fb8186d007\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" Apr 20 19:16:59.561662 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.561579 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" Apr 20 19:16:59.711008 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:16:59.710979 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr"] Apr 20 19:16:59.717080 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:16:59.715026 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f2c3012_6e24_4bb5_b0e9_72fb8186d007.slice/crio-32c79d34d7f618ae63d7e92adb8c6440aa78f3f28505e14cae58eb39e2f446b6 WatchSource:0}: Error finding container 32c79d34d7f618ae63d7e92adb8c6440aa78f3f28505e14cae58eb39e2f446b6: Status 404 returned error can't find the container with id 32c79d34d7f618ae63d7e92adb8c6440aa78f3f28505e14cae58eb39e2f446b6 Apr 20 19:17:00.291871 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:00.291836 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" event={"ID":"7f2c3012-6e24-4bb5-b0e9-72fb8186d007","Type":"ContainerStarted","Data":"32c79d34d7f618ae63d7e92adb8c6440aa78f3f28505e14cae58eb39e2f446b6"} Apr 20 19:17:02.182058 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:02.182017 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 19:17:02.182344 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:02.182086 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 19:17:02.301130 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:02.301090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" 
event={"ID":"7f2c3012-6e24-4bb5-b0e9-72fb8186d007","Type":"ContainerStarted","Data":"c6b1001fb040fb288faae8f77c79309975114f86c9f129f09f459319d4089d70"} Apr 20 19:17:02.301364 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:02.301194 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" Apr 20 19:17:02.318868 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:02.318796 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" podStartSLOduration=0.854316775 podStartE2EDuration="3.318777576s" podCreationTimestamp="2026-04-20 19:16:59 +0000 UTC" firstStartedPulling="2026-04-20 19:16:59.717324106 +0000 UTC m=+528.436626860" lastFinishedPulling="2026-04-20 19:17:02.181784901 +0000 UTC m=+530.901087661" observedRunningTime="2026-04-20 19:17:02.317790943 +0000 UTC m=+531.037093715" watchObservedRunningTime="2026-04-20 19:17:02.318777576 +0000 UTC m=+531.038080348" Apr 20 19:17:03.305981 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:03.305948 2572 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-8ptvr container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 19:17:03.306424 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:03.306010 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" podUID="7f2c3012-6e24-4bb5-b0e9-72fb8186d007" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 19:17:04.274423 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:04.274392 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-hkrrj" Apr 20 19:17:06.305408 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:06.305376 2572 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-8ptvr" Apr 20 19:17:13.238991 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:13.238957 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-njx8s" Apr 20 19:17:49.556541 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.556501 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m"] Apr 20 19:17:49.561631 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.561606 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:17:49.564337 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.564314 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:17:49.564438 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.564387 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:17:49.564989 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.564970 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-npjhc\"" Apr 20 19:17:49.572064 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.572044 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m"] Apr 20 19:17:49.655637 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.655607 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ce8f7cc-957d-402d-932a-628b31bd73c9-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" (UID: \"7ce8f7cc-957d-402d-932a-628b31bd73c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:17:49.655792 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.655658 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shcf2\" (UniqueName: \"kubernetes.io/projected/7ce8f7cc-957d-402d-932a-628b31bd73c9-kube-api-access-shcf2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" (UID: \"7ce8f7cc-957d-402d-932a-628b31bd73c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:17:49.756738 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.756700 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ce8f7cc-957d-402d-932a-628b31bd73c9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" (UID: \"7ce8f7cc-957d-402d-932a-628b31bd73c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:17:49.756893 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.756759 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shcf2\" (UniqueName: \"kubernetes.io/projected/7ce8f7cc-957d-402d-932a-628b31bd73c9-kube-api-access-shcf2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" (UID: \"7ce8f7cc-957d-402d-932a-628b31bd73c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:17:49.757153 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.757134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ce8f7cc-957d-402d-932a-628b31bd73c9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" (UID: 
\"7ce8f7cc-957d-402d-932a-628b31bd73c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:17:49.766084 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.766052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shcf2\" (UniqueName: \"kubernetes.io/projected/7ce8f7cc-957d-402d-932a-628b31bd73c9-kube-api-access-shcf2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" (UID: \"7ce8f7cc-957d-402d-932a-628b31bd73c9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:17:49.872628 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:49.872553 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:17:50.003621 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:50.003591 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m"] Apr 20 19:17:50.005263 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:17:50.005231 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ce8f7cc_957d_402d_932a_628b31bd73c9.slice/crio-f0fa5323fbc702364118970df6bea4a093caa4167e46229e20c271606f65b2e0 WatchSource:0}: Error finding container f0fa5323fbc702364118970df6bea4a093caa4167e46229e20c271606f65b2e0: Status 404 returned error can't find the container with id f0fa5323fbc702364118970df6bea4a093caa4167e46229e20c271606f65b2e0 Apr 20 19:17:50.448493 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:50.448434 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" event={"ID":"7ce8f7cc-957d-402d-932a-628b31bd73c9","Type":"ContainerStarted","Data":"f0fa5323fbc702364118970df6bea4a093caa4167e46229e20c271606f65b2e0"} Apr 20 19:17:56.471649 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:17:56.471609 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" event={"ID":"7ce8f7cc-957d-402d-932a-628b31bd73c9","Type":"ContainerStarted","Data":"d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f"} Apr 20 19:17:56.472070 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:56.471817 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:17:56.493447 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:17:56.493401 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" podStartSLOduration=2.049447684 podStartE2EDuration="7.49338759s" podCreationTimestamp="2026-04-20 19:17:49 +0000 UTC" firstStartedPulling="2026-04-20 19:17:50.007545613 +0000 UTC m=+578.726848366" lastFinishedPulling="2026-04-20 19:17:55.45148552 +0000 UTC m=+584.170788272" observedRunningTime="2026-04-20 19:17:56.491105689 +0000 UTC m=+585.210408460" watchObservedRunningTime="2026-04-20 19:17:56.49338759 +0000 UTC m=+585.212690362" Apr 20 19:18:07.477752 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:07.477716 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" Apr 20 19:18:08.905806 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:08.905773 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk"] Apr 20 19:18:08.907947 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:08.907919 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" Apr 20 19:18:08.923004 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:08.922979 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk"] Apr 20 19:18:09.016522 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.016490 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/637ebd26-96fc-4e46-9142-093f29cd3ca9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-rczrk\" (UID: \"637ebd26-96fc-4e46-9142-093f29cd3ca9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" Apr 20 19:18:09.016665 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.016541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr6cx\" (UniqueName: \"kubernetes.io/projected/637ebd26-96fc-4e46-9142-093f29cd3ca9-kube-api-access-pr6cx\") pod \"kuadrant-operator-controller-manager-84b657d985-rczrk\" (UID: \"637ebd26-96fc-4e46-9142-093f29cd3ca9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" Apr 20 19:18:09.117545 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.117509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pr6cx\" (UniqueName: \"kubernetes.io/projected/637ebd26-96fc-4e46-9142-093f29cd3ca9-kube-api-access-pr6cx\") pod \"kuadrant-operator-controller-manager-84b657d985-rczrk\" (UID: \"637ebd26-96fc-4e46-9142-093f29cd3ca9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" Apr 20 19:18:09.117716 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.117618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/637ebd26-96fc-4e46-9142-093f29cd3ca9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-rczrk\" (UID: \"637ebd26-96fc-4e46-9142-093f29cd3ca9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" Apr 20 19:18:09.117992 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.117971 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/637ebd26-96fc-4e46-9142-093f29cd3ca9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-rczrk\" (UID: \"637ebd26-96fc-4e46-9142-093f29cd3ca9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" Apr 20 19:18:09.128174 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.128149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr6cx\" (UniqueName: \"kubernetes.io/projected/637ebd26-96fc-4e46-9142-093f29cd3ca9-kube-api-access-pr6cx\") pod \"kuadrant-operator-controller-manager-84b657d985-rczrk\" (UID: \"637ebd26-96fc-4e46-9142-093f29cd3ca9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" Apr 20 19:18:09.180158 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.180082 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m"] Apr 20 19:18:09.180366 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.180342 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" containerName="manager" containerID="cri-o://d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f" gracePeriod=2 Apr 20 19:18:09.186840 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.186806 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m"] Apr 20 19:18:09.197542 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.197513 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk"] Apr 20 19:18:09.197862 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.197817 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" Apr 20 19:18:09.205848 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.205818 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk"] Apr 20 19:18:09.209196 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.209167 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"] Apr 20 19:18:09.209676 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.209652 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" containerName="manager" Apr 20 19:18:09.209676 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.209677 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" containerName="manager" Apr 20 19:18:09.209814 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.209758 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" containerName="manager" Apr 20 19:18:09.212001 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.211979 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" Apr 20 19:18:09.234452 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.234416 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"] Apr 20 19:18:09.236393 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.236372 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" Apr 20 19:18:09.241845 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.241816 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"] Apr 20 19:18:09.280820 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.280795 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"] Apr 20 19:18:09.291158 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.291117 2572 status_manager.go:895] "Failed to get status for pod" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object" Apr 20 19:18:09.293388 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.293356 2572 status_manager.go:895] "Failed to get status for pod" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": 
no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object" Apr 20 19:18:09.320117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.319837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f69470bf-0c1e-48bf-84b9-490641866559-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-s5trd\" (UID: \"f69470bf-0c1e-48bf-84b9-490641866559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" Apr 20 19:18:09.320117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.319892 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4bb7816-15bd-4011-8fe7-e7543af66b1a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jl247\" (UID: \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" Apr 20 19:18:09.320117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.319941 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bdks\" (UniqueName: \"kubernetes.io/projected/d4bb7816-15bd-4011-8fe7-e7543af66b1a-kube-api-access-4bdks\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jl247\" (UID: \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" Apr 20 19:18:09.320117 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.320006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gm5d\" (UniqueName: \"kubernetes.io/projected/f69470bf-0c1e-48bf-84b9-490641866559-kube-api-access-5gm5d\") pod \"kuadrant-operator-controller-manager-84b657d985-s5trd\" (UID: \"f69470bf-0c1e-48bf-84b9-490641866559\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" Apr 20 19:18:09.421121 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.420965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gm5d\" (UniqueName: \"kubernetes.io/projected/f69470bf-0c1e-48bf-84b9-490641866559-kube-api-access-5gm5d\") pod \"kuadrant-operator-controller-manager-84b657d985-s5trd\" (UID: \"f69470bf-0c1e-48bf-84b9-490641866559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" Apr 20 19:18:09.421121 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.421023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f69470bf-0c1e-48bf-84b9-490641866559-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-s5trd\" (UID: \"f69470bf-0c1e-48bf-84b9-490641866559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" Apr 20 19:18:09.421121 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.421056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4bb7816-15bd-4011-8fe7-e7543af66b1a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jl247\" (UID: \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" Apr 20 19:18:09.421121 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.421089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bdks\" (UniqueName: \"kubernetes.io/projected/d4bb7816-15bd-4011-8fe7-e7543af66b1a-kube-api-access-4bdks\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jl247\" (UID: \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" Apr 20 
19:18:09.421498 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.421369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f69470bf-0c1e-48bf-84b9-490641866559-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-s5trd\" (UID: \"f69470bf-0c1e-48bf-84b9-490641866559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" Apr 20 19:18:09.421498 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.421442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4bb7816-15bd-4011-8fe7-e7543af66b1a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jl247\" (UID: \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" Apr 20 19:18:09.436055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.436005 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m"
Apr 20 19:18:09.438307 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.438271 2572 status_manager.go:895] "Failed to get status for pod" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object"
Apr 20 19:18:09.438691 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.438672 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bdks\" (UniqueName: \"kubernetes.io/projected/d4bb7816-15bd-4011-8fe7-e7543af66b1a-kube-api-access-4bdks\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-jl247\" (UID: \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"
Apr 20 19:18:09.441429 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.441412 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gm5d\" (UniqueName: \"kubernetes.io/projected/f69470bf-0c1e-48bf-84b9-490641866559-kube-api-access-5gm5d\") pod \"kuadrant-operator-controller-manager-84b657d985-s5trd\" (UID: \"f69470bf-0c1e-48bf-84b9-490641866559\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"
Apr 20 19:18:09.512960 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.512933 2572 generic.go:358] "Generic (PLEG): container finished" podID="7ce8f7cc-957d-402d-932a-628b31bd73c9" containerID="d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f" exitCode=0
Apr 20 19:18:09.513065 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.512976 2572 scope.go:117] "RemoveContainer" containerID="d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f"
Apr 20 19:18:09.513065 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.512976 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m"
Apr 20 19:18:09.515246 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.515224 2572 status_manager.go:895] "Failed to get status for pod" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object"
Apr 20 19:18:09.520236 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.520213 2572 scope.go:117] "RemoveContainer" containerID="d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f"
Apr 20 19:18:09.520462 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:18:09.520445 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f\": container with ID starting with d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f not found: ID does not exist" containerID="d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f"
Apr 20 19:18:09.520523 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.520483 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f"} err="failed to get container status \"d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f\": rpc error: code = NotFound desc = could not find container \"d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f\": container with ID starting with d75b179374ee7ee15726fbf30ca7d67082dd838f729a55b08b09e0d820cb822f not found: ID does not exist"
Apr 20 19:18:09.521702 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.521685 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ce8f7cc-957d-402d-932a-628b31bd73c9-extensions-socket-volume\") pod \"7ce8f7cc-957d-402d-932a-628b31bd73c9\" (UID: \"7ce8f7cc-957d-402d-932a-628b31bd73c9\") "
Apr 20 19:18:09.521762 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.521723 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shcf2\" (UniqueName: \"kubernetes.io/projected/7ce8f7cc-957d-402d-932a-628b31bd73c9-kube-api-access-shcf2\") pod \"7ce8f7cc-957d-402d-932a-628b31bd73c9\" (UID: \"7ce8f7cc-957d-402d-932a-628b31bd73c9\") "
Apr 20 19:18:09.522154 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.522132 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ce8f7cc-957d-402d-932a-628b31bd73c9-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "7ce8f7cc-957d-402d-932a-628b31bd73c9" (UID: "7ce8f7cc-957d-402d-932a-628b31bd73c9"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:18:09.523700 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.523677 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce8f7cc-957d-402d-932a-628b31bd73c9-kube-api-access-shcf2" (OuterVolumeSpecName: "kube-api-access-shcf2") pod "7ce8f7cc-957d-402d-932a-628b31bd73c9" (UID: "7ce8f7cc-957d-402d-932a-628b31bd73c9"). InnerVolumeSpecName "kube-api-access-shcf2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:18:09.560865 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.560842 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"
Apr 20 19:18:09.567444 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.567425 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"
Apr 20 19:18:09.623420 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.623368 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7ce8f7cc-957d-402d-932a-628b31bd73c9-extensions-socket-volume\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\""
Apr 20 19:18:09.623420 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.623398 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-shcf2\" (UniqueName: \"kubernetes.io/projected/7ce8f7cc-957d-402d-932a-628b31bd73c9-kube-api-access-shcf2\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\""
Apr 20 19:18:09.703187 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.703100 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"]
Apr 20 19:18:09.706179 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:18:09.706141 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4bb7816_15bd_4011_8fe7_e7543af66b1a.slice/crio-24d35dcc50e025864f7747bf33ee9899296c5fa6297d8dbe5d95d8a79660bc5c WatchSource:0}: Error finding container 24d35dcc50e025864f7747bf33ee9899296c5fa6297d8dbe5d95d8a79660bc5c: Status 404 returned error can't find the container with id 24d35dcc50e025864f7747bf33ee9899296c5fa6297d8dbe5d95d8a79660bc5c
Apr 20 19:18:09.719167 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.719147 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"]
Apr 20 19:18:09.722129 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:18:09.722100 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf69470bf_0c1e_48bf_84b9_490641866559.slice/crio-900c09da09684787f9981803a32c80de9dd2274af708d1fe8289566134bf4696 WatchSource:0}: Error finding container 900c09da09684787f9981803a32c80de9dd2274af708d1fe8289566134bf4696: Status 404 returned error can't find the container with id 900c09da09684787f9981803a32c80de9dd2274af708d1fe8289566134bf4696
Apr 20 19:18:09.827047 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.827017 2572 status_manager.go:895] "Failed to get status for pod" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-fkr9m" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-fkr9m\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object"
Apr 20 19:18:09.829722 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:09.829698 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ce8f7cc-957d-402d-932a-628b31bd73c9" path="/var/lib/kubelet/pods/7ce8f7cc-957d-402d-932a-628b31bd73c9/volumes"
Apr 20 19:18:10.517784 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:10.517748 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" event={"ID":"f69470bf-0c1e-48bf-84b9-490641866559","Type":"ContainerStarted","Data":"6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574"}
Apr 20 19:18:10.518268 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:10.517791 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" event={"ID":"f69470bf-0c1e-48bf-84b9-490641866559","Type":"ContainerStarted","Data":"900c09da09684787f9981803a32c80de9dd2274af708d1fe8289566134bf4696"}
Apr 20 19:18:10.518268 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:10.517843 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"
Apr 20 19:18:10.520246 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:10.520221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" event={"ID":"d4bb7816-15bd-4011-8fe7-e7543af66b1a","Type":"ContainerStarted","Data":"7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0"}
Apr 20 19:18:10.520362 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:10.520249 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" event={"ID":"d4bb7816-15bd-4011-8fe7-e7543af66b1a","Type":"ContainerStarted","Data":"24d35dcc50e025864f7747bf33ee9899296c5fa6297d8dbe5d95d8a79660bc5c"}
Apr 20 19:18:10.520362 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:10.520335 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"
Apr 20 19:18:10.540414 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:10.540380 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" podStartSLOduration=1.540367366 podStartE2EDuration="1.540367366s" podCreationTimestamp="2026-04-20 19:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:18:10.540050225 +0000 UTC m=+599.259352997" watchObservedRunningTime="2026-04-20 19:18:10.540367366 +0000 UTC m=+599.259670131"
Apr 20 19:18:11.730542 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:18:11.730508 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod637ebd26_96fc_4e46_9142_093f29cd3ca9.slice/crio-ae335cc2709d05486172da83646786bb54a7091fc679f1ae2ec27f80d8c74b6c WatchSource:0}: Error finding container ae335cc2709d05486172da83646786bb54a7091fc679f1ae2ec27f80d8c74b6c: Status 404 returned error can't find the container with id ae335cc2709d05486172da83646786bb54a7091fc679f1ae2ec27f80d8c74b6c
Apr 20 19:18:12.529437 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.529401 2572 generic.go:358] "Generic (PLEG): container finished" podID="637ebd26-96fc-4e46-9142-093f29cd3ca9" containerID="d0c7214853f5d5a54604eff0f7a61086e4b11bc01aa9a7b19035cd85e255a70a" exitCode=1
Apr 20 19:18:12.531661 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.531618 2572 status_manager.go:895] "Failed to get status for pod" podUID="637ebd26-96fc-4e46-9142-093f29cd3ca9" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" err="pods \"kuadrant-operator-controller-manager-84b657d985-rczrk\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object"
Apr 20 19:18:12.557789 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.557767 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk"
Apr 20 19:18:12.559836 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.559808 2572 status_manager.go:895] "Failed to get status for pod" podUID="637ebd26-96fc-4e46-9142-093f29cd3ca9" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" err="pods \"kuadrant-operator-controller-manager-84b657d985-rczrk\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object"
Apr 20 19:18:12.649416 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.649385 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/637ebd26-96fc-4e46-9142-093f29cd3ca9-extensions-socket-volume\") pod \"637ebd26-96fc-4e46-9142-093f29cd3ca9\" (UID: \"637ebd26-96fc-4e46-9142-093f29cd3ca9\") "
Apr 20 19:18:12.649577 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.649433 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pr6cx\" (UniqueName: \"kubernetes.io/projected/637ebd26-96fc-4e46-9142-093f29cd3ca9-kube-api-access-pr6cx\") pod \"637ebd26-96fc-4e46-9142-093f29cd3ca9\" (UID: \"637ebd26-96fc-4e46-9142-093f29cd3ca9\") "
Apr 20 19:18:12.649715 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.649689 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/637ebd26-96fc-4e46-9142-093f29cd3ca9-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "637ebd26-96fc-4e46-9142-093f29cd3ca9" (UID: "637ebd26-96fc-4e46-9142-093f29cd3ca9"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:18:12.652227 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.652189 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637ebd26-96fc-4e46-9142-093f29cd3ca9-kube-api-access-pr6cx" (OuterVolumeSpecName: "kube-api-access-pr6cx") pod "637ebd26-96fc-4e46-9142-093f29cd3ca9" (UID: "637ebd26-96fc-4e46-9142-093f29cd3ca9"). InnerVolumeSpecName "kube-api-access-pr6cx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:18:12.750554 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.750530 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/637ebd26-96fc-4e46-9142-093f29cd3ca9-extensions-socket-volume\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\""
Apr 20 19:18:12.750554 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:12.750552 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pr6cx\" (UniqueName: \"kubernetes.io/projected/637ebd26-96fc-4e46-9142-093f29cd3ca9-kube-api-access-pr6cx\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\""
Apr 20 19:18:13.533850 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:13.533822 2572 scope.go:117] "RemoveContainer" containerID="d0c7214853f5d5a54604eff0f7a61086e4b11bc01aa9a7b19035cd85e255a70a"
Apr 20 19:18:13.534071 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:13.533822 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk"
Apr 20 19:18:13.537727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:13.537692 2572 status_manager.go:895] "Failed to get status for pod" podUID="637ebd26-96fc-4e46-9142-093f29cd3ca9" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" err="pods \"kuadrant-operator-controller-manager-84b657d985-rczrk\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object"
Apr 20 19:18:13.545275 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:13.545246 2572 status_manager.go:895] "Failed to get status for pod" podUID="637ebd26-96fc-4e46-9142-093f29cd3ca9" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-rczrk" err="pods \"kuadrant-operator-controller-manager-84b657d985-rczrk\" is forbidden: User \"system:node:ip-10-0-136-5.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-5.ec2.internal' and this object"
Apr 20 19:18:13.829496 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:13.829400 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637ebd26-96fc-4e46-9142-093f29cd3ca9" path="/var/lib/kubelet/pods/637ebd26-96fc-4e46-9142-093f29cd3ca9/volumes"
Apr 20 19:18:21.527062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.526985 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"
Apr 20 19:18:21.527062 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.527039 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"
Apr 20 19:18:21.548228 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.548187 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" podStartSLOduration=12.548173012 podStartE2EDuration="12.548173012s" podCreationTimestamp="2026-04-20 19:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:18:10.562847207 +0000 UTC m=+599.282149979" watchObservedRunningTime="2026-04-20 19:18:21.548173012 +0000 UTC m=+610.267475784"
Apr 20 19:18:21.604621 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.604593 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"]
Apr 20 19:18:21.604839 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.604795 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" podUID="d4bb7816-15bd-4011-8fe7-e7543af66b1a" containerName="manager" containerID="cri-o://7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0" gracePeriod=10
Apr 20 19:18:21.846722 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.846698 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"
Apr 20 19:18:21.892744 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.892716 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"]
Apr 20 19:18:21.893046 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.893033 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4bb7816-15bd-4011-8fe7-e7543af66b1a" containerName="manager"
Apr 20 19:18:21.893093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.893048 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4bb7816-15bd-4011-8fe7-e7543af66b1a" containerName="manager"
Apr 20 19:18:21.893093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.893056 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="637ebd26-96fc-4e46-9142-093f29cd3ca9" containerName="manager"
Apr 20 19:18:21.893093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.893061 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="637ebd26-96fc-4e46-9142-093f29cd3ca9" containerName="manager"
Apr 20 19:18:21.893186 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.893107 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="637ebd26-96fc-4e46-9142-093f29cd3ca9" containerName="manager"
Apr 20 19:18:21.893186 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.893116 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4bb7816-15bd-4011-8fe7-e7543af66b1a" containerName="manager"
Apr 20 19:18:21.896400 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.896382 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:21.914086 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.914060 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"]
Apr 20 19:18:21.922126 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.922095 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bdks\" (UniqueName: \"kubernetes.io/projected/d4bb7816-15bd-4011-8fe7-e7543af66b1a-kube-api-access-4bdks\") pod \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\" (UID: \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\") "
Apr 20 19:18:21.922309 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.922152 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4bb7816-15bd-4011-8fe7-e7543af66b1a-extensions-socket-volume\") pod \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\" (UID: \"d4bb7816-15bd-4011-8fe7-e7543af66b1a\") "
Apr 20 19:18:21.922550 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.922523 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4bb7816-15bd-4011-8fe7-e7543af66b1a-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "d4bb7816-15bd-4011-8fe7-e7543af66b1a" (UID: "d4bb7816-15bd-4011-8fe7-e7543af66b1a"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:18:21.924286 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:21.924260 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4bb7816-15bd-4011-8fe7-e7543af66b1a-kube-api-access-4bdks" (OuterVolumeSpecName: "kube-api-access-4bdks") pod "d4bb7816-15bd-4011-8fe7-e7543af66b1a" (UID: "d4bb7816-15bd-4011-8fe7-e7543af66b1a"). InnerVolumeSpecName "kube-api-access-4bdks". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:18:22.023727 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.023694 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c95lh\" (UniqueName: \"kubernetes.io/projected/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-kube-api-access-c95lh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7c22m\" (UID: \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:22.023907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.023738 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7c22m\" (UID: \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:22.023907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.023838 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4bdks\" (UniqueName: \"kubernetes.io/projected/d4bb7816-15bd-4011-8fe7-e7543af66b1a-kube-api-access-4bdks\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\""
Apr 20 19:18:22.023907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.023850 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d4bb7816-15bd-4011-8fe7-e7543af66b1a-extensions-socket-volume\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\""
Apr 20 19:18:22.125205 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.125122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c95lh\" (UniqueName: \"kubernetes.io/projected/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-kube-api-access-c95lh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7c22m\" (UID: \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:22.125205 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.125163 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7c22m\" (UID: \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:22.125644 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.125625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7c22m\" (UID: \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:22.137441 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.137404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c95lh\" (UniqueName: \"kubernetes.io/projected/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-kube-api-access-c95lh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-7c22m\" (UID: \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:22.207358 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.207326 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:22.326602 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.326383 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"]
Apr 20 19:18:22.329395 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:18:22.329371 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca0c216b_59ae_4b35_bf71_e65828a1a0ac.slice/crio-2f54ea21b4f08526b74574229104daf40030067945595291bacb68e4e9baf802 WatchSource:0}: Error finding container 2f54ea21b4f08526b74574229104daf40030067945595291bacb68e4e9baf802: Status 404 returned error can't find the container with id 2f54ea21b4f08526b74574229104daf40030067945595291bacb68e4e9baf802
Apr 20 19:18:22.563971 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.563931 2572 generic.go:358] "Generic (PLEG): container finished" podID="d4bb7816-15bd-4011-8fe7-e7543af66b1a" containerID="7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0" exitCode=0
Apr 20 19:18:22.564427 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.564003 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"
Apr 20 19:18:22.564427 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.564013 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" event={"ID":"d4bb7816-15bd-4011-8fe7-e7543af66b1a","Type":"ContainerDied","Data":"7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0"}
Apr 20 19:18:22.564427 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.564048 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247" event={"ID":"d4bb7816-15bd-4011-8fe7-e7543af66b1a","Type":"ContainerDied","Data":"24d35dcc50e025864f7747bf33ee9899296c5fa6297d8dbe5d95d8a79660bc5c"}
Apr 20 19:18:22.564427 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.564064 2572 scope.go:117] "RemoveContainer" containerID="7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0"
Apr 20 19:18:22.565365 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.565326 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m" event={"ID":"ca0c216b-59ae-4b35-bf71-e65828a1a0ac","Type":"ContainerStarted","Data":"cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da"}
Apr 20 19:18:22.565365 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.565360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m" event={"ID":"ca0c216b-59ae-4b35-bf71-e65828a1a0ac","Type":"ContainerStarted","Data":"2f54ea21b4f08526b74574229104daf40030067945595291bacb68e4e9baf802"}
Apr 20 19:18:22.565557 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.565485 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:22.572418 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.572401 2572 scope.go:117] "RemoveContainer" containerID="7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0"
Apr 20 19:18:22.572743 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:18:22.572722 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0\": container with ID starting with 7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0 not found: ID does not exist" containerID="7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0"
Apr 20 19:18:22.572834 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.572753 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0"} err="failed to get container status \"7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0\": rpc error: code = NotFound desc = could not find container \"7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0\": container with ID starting with 7e37cb4a1eb9047e91780ea6648e1b2093d9b909bddc9992494e7c4338d532b0 not found: ID does not exist"
Apr 20 19:18:22.585015 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.584972 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m" podStartSLOduration=1.5849612180000001 podStartE2EDuration="1.584961218s" podCreationTimestamp="2026-04-20 19:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:18:22.584210599 +0000 UTC m=+611.303513368" watchObservedRunningTime="2026-04-20 19:18:22.584961218 +0000 UTC m=+611.304263990"
Apr 20 19:18:22.603806 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.603782 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"]
Apr 20 19:18:22.609876 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:22.609854 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-jl247"]
Apr 20 19:18:23.828857 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:23.828824 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4bb7816-15bd-4011-8fe7-e7543af66b1a" path="/var/lib/kubelet/pods/d4bb7816-15bd-4011-8fe7-e7543af66b1a/volumes"
Apr 20 19:18:33.571504 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:33.571457 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"
Apr 20 19:18:33.611043 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:33.610984 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"]
Apr 20 19:18:33.611323 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:33.611297 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" podUID="f69470bf-0c1e-48bf-84b9-490641866559" containerName="manager" containerID="cri-o://6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574" gracePeriod=10
Apr 20 19:18:33.837858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:33.837837 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"
Apr 20 19:18:33.928691 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:33.928655 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f69470bf-0c1e-48bf-84b9-490641866559-extensions-socket-volume\") pod \"f69470bf-0c1e-48bf-84b9-490641866559\" (UID: \"f69470bf-0c1e-48bf-84b9-490641866559\") "
Apr 20 19:18:33.928865 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:33.928724 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gm5d\" (UniqueName: \"kubernetes.io/projected/f69470bf-0c1e-48bf-84b9-490641866559-kube-api-access-5gm5d\") pod \"f69470bf-0c1e-48bf-84b9-490641866559\" (UID: \"f69470bf-0c1e-48bf-84b9-490641866559\") "
Apr 20 19:18:33.928998 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:33.928976 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f69470bf-0c1e-48bf-84b9-490641866559-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "f69470bf-0c1e-48bf-84b9-490641866559" (UID: "f69470bf-0c1e-48bf-84b9-490641866559"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:18:33.930707 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:33.930685 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f69470bf-0c1e-48bf-84b9-490641866559-kube-api-access-5gm5d" (OuterVolumeSpecName: "kube-api-access-5gm5d") pod "f69470bf-0c1e-48bf-84b9-490641866559" (UID: "f69470bf-0c1e-48bf-84b9-490641866559"). InnerVolumeSpecName "kube-api-access-5gm5d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:18:34.030340 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.030307 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5gm5d\" (UniqueName: \"kubernetes.io/projected/f69470bf-0c1e-48bf-84b9-490641866559-kube-api-access-5gm5d\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\""
Apr 20 19:18:34.030340 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.030335 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f69470bf-0c1e-48bf-84b9-490641866559-extensions-socket-volume\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\""
Apr 20 19:18:34.609405 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.609367 2572 generic.go:358] "Generic (PLEG): container finished" podID="f69470bf-0c1e-48bf-84b9-490641866559" containerID="6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574" exitCode=0
Apr 20 19:18:34.609864 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.609414 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" event={"ID":"f69470bf-0c1e-48bf-84b9-490641866559","Type":"ContainerDied","Data":"6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574"}
Apr 20 19:18:34.609864 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.609443 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"
Apr 20 19:18:34.609864 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.609459 2572 scope.go:117] "RemoveContainer" containerID="6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574"
Apr 20 19:18:34.609864 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.609446 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd" event={"ID":"f69470bf-0c1e-48bf-84b9-490641866559","Type":"ContainerDied","Data":"900c09da09684787f9981803a32c80de9dd2274af708d1fe8289566134bf4696"}
Apr 20 19:18:34.617632 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.617444 2572 scope.go:117] "RemoveContainer" containerID="6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574"
Apr 20 19:18:34.617719 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:18:34.617700 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574\": container with ID starting with 6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574 not found: ID does not exist" containerID="6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574"
Apr 20 19:18:34.617756 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.617728 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574"} err="failed to get container status \"6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574\": rpc error: code = NotFound desc = could not find container \"6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574\": container with ID starting with 6aba2498e9a6d767d2a5d0dc0f93b5d8a6c1b50d85e837b642b4a7cb7d3f7574 not found: ID does not exist"
Apr 20 19:18:34.631287 ip-10-0-136-5
kubenswrapper[2572]: I0420 19:18:34.631266 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"] Apr 20 19:18:34.636378 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:34.636358 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-s5trd"] Apr 20 19:18:35.828910 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:35.828876 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f69470bf-0c1e-48bf-84b9-490641866559" path="/var/lib/kubelet/pods/f69470bf-0c1e-48bf-84b9-490641866559/volumes" Apr 20 19:18:51.369226 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.369193 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b96nt"] Apr 20 19:18:51.369813 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.369631 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f69470bf-0c1e-48bf-84b9-490641866559" containerName="manager" Apr 20 19:18:51.369813 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.369651 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="f69470bf-0c1e-48bf-84b9-490641866559" containerName="manager" Apr 20 19:18:51.369813 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.369751 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="f69470bf-0c1e-48bf-84b9-490641866559" containerName="manager" Apr 20 19:18:51.373983 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.373963 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:18:51.376337 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.376318 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ggnhw\"" Apr 20 19:18:51.376446 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.376318 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 19:18:51.382846 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.382814 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b96nt"] Apr 20 19:18:51.463884 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.463838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72svf\" (UniqueName: \"kubernetes.io/projected/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-kube-api-access-72svf\") pod \"limitador-limitador-7d549b5b-b96nt\" (UID: \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:18:51.464051 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.463890 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-config-file\") pod \"limitador-limitador-7d549b5b-b96nt\" (UID: \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:18:51.477280 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.474322 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b96nt"] Apr 20 19:18:51.564863 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.564824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72svf\" (UniqueName: 
\"kubernetes.io/projected/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-kube-api-access-72svf\") pod \"limitador-limitador-7d549b5b-b96nt\" (UID: \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:18:51.564863 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.564870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-config-file\") pod \"limitador-limitador-7d549b5b-b96nt\" (UID: \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:18:51.565532 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.565514 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-config-file\") pod \"limitador-limitador-7d549b5b-b96nt\" (UID: \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:18:51.572893 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.572864 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72svf\" (UniqueName: \"kubernetes.io/projected/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-kube-api-access-72svf\") pod \"limitador-limitador-7d549b5b-b96nt\" (UID: \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\") " pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:18:51.684050 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.684021 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:18:51.805344 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:51.805322 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b96nt"] Apr 20 19:18:51.807344 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:18:51.807317 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc113cfc_c3d5_42a4_b37c_de4dd4276fad.slice/crio-0dadf6168a5dd0ba8a58add13a89d08afe9788814fb914d34264f4a0388a07ff WatchSource:0}: Error finding container 0dadf6168a5dd0ba8a58add13a89d08afe9788814fb914d34264f4a0388a07ff: Status 404 returned error can't find the container with id 0dadf6168a5dd0ba8a58add13a89d08afe9788814fb914d34264f4a0388a07ff Apr 20 19:18:52.675970 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:52.675935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" event={"ID":"fc113cfc-c3d5-42a4-b37c-de4dd4276fad","Type":"ContainerStarted","Data":"0dadf6168a5dd0ba8a58add13a89d08afe9788814fb914d34264f4a0388a07ff"} Apr 20 19:18:55.687326 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:55.687286 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" event={"ID":"fc113cfc-c3d5-42a4-b37c-de4dd4276fad","Type":"ContainerStarted","Data":"606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2"} Apr 20 19:18:55.687878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:55.687338 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:18:55.704098 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:18:55.704054 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" podStartSLOduration=1.800305114 
podStartE2EDuration="4.704040044s" podCreationTimestamp="2026-04-20 19:18:51 +0000 UTC" firstStartedPulling="2026-04-20 19:18:51.809047965 +0000 UTC m=+640.528350715" lastFinishedPulling="2026-04-20 19:18:54.712782884 +0000 UTC m=+643.432085645" observedRunningTime="2026-04-20 19:18:55.702566736 +0000 UTC m=+644.421869510" watchObservedRunningTime="2026-04-20 19:18:55.704040044 +0000 UTC m=+644.423342843" Apr 20 19:19:06.692064 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:06.692032 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:19:08.842857 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:08.842812 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b96nt"] Apr 20 19:19:08.843353 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:08.843025 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" podUID="fc113cfc-c3d5-42a4-b37c-de4dd4276fad" containerName="limitador" containerID="cri-o://606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2" gracePeriod=30 Apr 20 19:19:09.382171 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.382147 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:19:09.512888 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.512800 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-config-file\") pod \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\" (UID: \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\") " Apr 20 19:19:09.513057 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.513004 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72svf\" (UniqueName: \"kubernetes.io/projected/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-kube-api-access-72svf\") pod \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\" (UID: \"fc113cfc-c3d5-42a4-b37c-de4dd4276fad\") " Apr 20 19:19:09.513167 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.513143 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-config-file" (OuterVolumeSpecName: "config-file") pod "fc113cfc-c3d5-42a4-b37c-de4dd4276fad" (UID: "fc113cfc-c3d5-42a4-b37c-de4dd4276fad"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:19:09.513255 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.513241 2572 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-config-file\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:19:09.515017 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.514990 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-kube-api-access-72svf" (OuterVolumeSpecName: "kube-api-access-72svf") pod "fc113cfc-c3d5-42a4-b37c-de4dd4276fad" (UID: "fc113cfc-c3d5-42a4-b37c-de4dd4276fad"). 
InnerVolumeSpecName "kube-api-access-72svf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:19:09.614436 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.614405 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72svf\" (UniqueName: \"kubernetes.io/projected/fc113cfc-c3d5-42a4-b37c-de4dd4276fad-kube-api-access-72svf\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:19:09.733892 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.733856 2572 generic.go:358] "Generic (PLEG): container finished" podID="fc113cfc-c3d5-42a4-b37c-de4dd4276fad" containerID="606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2" exitCode=0 Apr 20 19:19:09.734070 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.733923 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" event={"ID":"fc113cfc-c3d5-42a4-b37c-de4dd4276fad","Type":"ContainerDied","Data":"606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2"} Apr 20 19:19:09.734070 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.733929 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" Apr 20 19:19:09.734070 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.733951 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-b96nt" event={"ID":"fc113cfc-c3d5-42a4-b37c-de4dd4276fad","Type":"ContainerDied","Data":"0dadf6168a5dd0ba8a58add13a89d08afe9788814fb914d34264f4a0388a07ff"} Apr 20 19:19:09.734070 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.733966 2572 scope.go:117] "RemoveContainer" containerID="606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2" Apr 20 19:19:09.741960 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.741941 2572 scope.go:117] "RemoveContainer" containerID="606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2" Apr 20 19:19:09.742182 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:19:09.742165 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2\": container with ID starting with 606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2 not found: ID does not exist" containerID="606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2" Apr 20 19:19:09.742226 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.742190 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2"} err="failed to get container status \"606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2\": rpc error: code = NotFound desc = could not find container \"606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2\": container with ID starting with 606fce47a6863f5cb77f80fe43c6dbb740419d3a6a60da74eb4106812cd4e1a2 not found: ID does not exist" Apr 20 19:19:09.754979 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.754856 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b96nt"] Apr 20 19:19:09.758207 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.758170 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-b96nt"] Apr 20 19:19:09.829085 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:09.829003 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc113cfc-c3d5-42a4-b37c-de4dd4276fad" path="/var/lib/kubelet/pods/fc113cfc-c3d5-42a4-b37c-de4dd4276fad/volumes" Apr 20 19:19:13.677858 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.677824 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-kj5qj"] Apr 20 19:19:13.678308 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.678133 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc113cfc-c3d5-42a4-b37c-de4dd4276fad" containerName="limitador" Apr 20 19:19:13.678308 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.678143 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc113cfc-c3d5-42a4-b37c-de4dd4276fad" containerName="limitador" Apr 20 19:19:13.678308 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.678196 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc113cfc-c3d5-42a4-b37c-de4dd4276fad" containerName="limitador" Apr 20 19:19:13.682689 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.682668 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:13.684905 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.684871 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 19:19:13.685037 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.684871 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-s58ch\"" Apr 20 19:19:13.687734 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.687711 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-kj5qj"] Apr 20 19:19:13.749274 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.749245 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5drrm\" (UniqueName: \"kubernetes.io/projected/b23241e5-4c3e-4302-9dd1-bf63e4497593-kube-api-access-5drrm\") pod \"postgres-868db5846d-kj5qj\" (UID: \"b23241e5-4c3e-4302-9dd1-bf63e4497593\") " pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:13.749426 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.749287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b23241e5-4c3e-4302-9dd1-bf63e4497593-data\") pod \"postgres-868db5846d-kj5qj\" (UID: \"b23241e5-4c3e-4302-9dd1-bf63e4497593\") " pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:13.850000 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.849968 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5drrm\" (UniqueName: \"kubernetes.io/projected/b23241e5-4c3e-4302-9dd1-bf63e4497593-kube-api-access-5drrm\") pod \"postgres-868db5846d-kj5qj\" (UID: \"b23241e5-4c3e-4302-9dd1-bf63e4497593\") " pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:13.850153 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.850013 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b23241e5-4c3e-4302-9dd1-bf63e4497593-data\") pod \"postgres-868db5846d-kj5qj\" (UID: \"b23241e5-4c3e-4302-9dd1-bf63e4497593\") " pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:13.850369 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.850351 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b23241e5-4c3e-4302-9dd1-bf63e4497593-data\") pod \"postgres-868db5846d-kj5qj\" (UID: \"b23241e5-4c3e-4302-9dd1-bf63e4497593\") " pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:13.857898 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.857868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5drrm\" (UniqueName: \"kubernetes.io/projected/b23241e5-4c3e-4302-9dd1-bf63e4497593-kube-api-access-5drrm\") pod \"postgres-868db5846d-kj5qj\" (UID: \"b23241e5-4c3e-4302-9dd1-bf63e4497593\") " pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:13.994281 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:13.994186 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:14.116926 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:14.116887 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-kj5qj"] Apr 20 19:19:14.120036 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:19:14.120002 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb23241e5_4c3e_4302_9dd1_bf63e4497593.slice/crio-850ca60afcec31169a05fc7a4f5c5324ab8e145979c5c5aa812a88ad0c2e50bc WatchSource:0}: Error finding container 850ca60afcec31169a05fc7a4f5c5324ab8e145979c5c5aa812a88ad0c2e50bc: Status 404 returned error can't find the container with id 850ca60afcec31169a05fc7a4f5c5324ab8e145979c5c5aa812a88ad0c2e50bc Apr 20 19:19:14.751347 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:14.751310 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-kj5qj" event={"ID":"b23241e5-4c3e-4302-9dd1-bf63e4497593","Type":"ContainerStarted","Data":"850ca60afcec31169a05fc7a4f5c5324ab8e145979c5c5aa812a88ad0c2e50bc"} Apr 20 19:19:19.772848 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:19.772764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-kj5qj" event={"ID":"b23241e5-4c3e-4302-9dd1-bf63e4497593","Type":"ContainerStarted","Data":"467de42effabde1bf3da2a9513109d7f75a18c46351a797087b8ebfbac80bb63"} Apr 20 19:19:19.773192 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:19.772862 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:19.788395 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:19.788348 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-kj5qj" podStartSLOduration=1.5047951899999998 podStartE2EDuration="6.788334649s" podCreationTimestamp="2026-04-20 19:19:13 +0000 UTC" 
firstStartedPulling="2026-04-20 19:19:14.121431666 +0000 UTC m=+662.840734421" lastFinishedPulling="2026-04-20 19:19:19.404971125 +0000 UTC m=+668.124273880" observedRunningTime="2026-04-20 19:19:19.787276089 +0000 UTC m=+668.506578861" watchObservedRunningTime="2026-04-20 19:19:19.788334649 +0000 UTC m=+668.507637422" Apr 20 19:19:25.804100 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:25.804067 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-kj5qj" Apr 20 19:19:28.575206 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.575174 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rnl6w"] Apr 20 19:19:28.579988 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.579969 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" Apr 20 19:19:28.581985 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.581967 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-x4p8t\"" Apr 20 19:19:28.586394 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.586373 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rnl6w"] Apr 20 19:19:28.673133 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.673105 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5xd4\" (UniqueName: \"kubernetes.io/projected/ed7cb01f-de30-474e-b80f-d042a1b5f4a7-kube-api-access-t5xd4\") pod \"maas-controller-6d4c8f55f9-rnl6w\" (UID: \"ed7cb01f-de30-474e-b80f-d042a1b5f4a7\") " pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" Apr 20 19:19:28.731958 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.731923 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-58f59846df-k6ljr"] Apr 20 19:19:28.734249 ip-10-0-136-5 kubenswrapper[2572]: 
I0420 19:19:28.734230 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-58f59846df-k6ljr" Apr 20 19:19:28.744265 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.744238 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-58f59846df-k6ljr"] Apr 20 19:19:28.773529 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.773502 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5xd4\" (UniqueName: \"kubernetes.io/projected/ed7cb01f-de30-474e-b80f-d042a1b5f4a7-kube-api-access-t5xd4\") pod \"maas-controller-6d4c8f55f9-rnl6w\" (UID: \"ed7cb01f-de30-474e-b80f-d042a1b5f4a7\") " pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" Apr 20 19:19:28.781192 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.781171 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5xd4\" (UniqueName: \"kubernetes.io/projected/ed7cb01f-de30-474e-b80f-d042a1b5f4a7-kube-api-access-t5xd4\") pod \"maas-controller-6d4c8f55f9-rnl6w\" (UID: \"ed7cb01f-de30-474e-b80f-d042a1b5f4a7\") " pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" Apr 20 19:19:28.858046 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.857957 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-58f59846df-k6ljr"] Apr 20 19:19:28.858221 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:19:28.858197 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-rbtks], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-58f59846df-k6ljr" podUID="4ceb5b4c-88af-4498-99f5-f515f0db70a9" Apr 20 19:19:28.874568 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.874530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbtks\" (UniqueName: 
\"kubernetes.io/projected/4ceb5b4c-88af-4498-99f5-f515f0db70a9-kube-api-access-rbtks\") pod \"maas-controller-58f59846df-k6ljr\" (UID: \"4ceb5b4c-88af-4498-99f5-f515f0db70a9\") " pod="opendatahub/maas-controller-58f59846df-k6ljr" Apr 20 19:19:28.886128 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.886102 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6876fc7db-58nvq"] Apr 20 19:19:28.888456 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.888441 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6876fc7db-58nvq" Apr 20 19:19:28.891887 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.891860 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" Apr 20 19:19:28.898373 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.898350 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6876fc7db-58nvq"] Apr 20 19:19:28.976111 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.976060 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svmbs\" (UniqueName: \"kubernetes.io/projected/664c0a37-161c-4f07-accb-39073fd6972e-kube-api-access-svmbs\") pod \"maas-controller-6876fc7db-58nvq\" (UID: \"664c0a37-161c-4f07-accb-39073fd6972e\") " pod="opendatahub/maas-controller-6876fc7db-58nvq" Apr 20 19:19:28.976279 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.976202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbtks\" (UniqueName: \"kubernetes.io/projected/4ceb5b4c-88af-4498-99f5-f515f0db70a9-kube-api-access-rbtks\") pod \"maas-controller-58f59846df-k6ljr\" (UID: \"4ceb5b4c-88af-4498-99f5-f515f0db70a9\") " pod="opendatahub/maas-controller-58f59846df-k6ljr" Apr 20 19:19:28.984353 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:28.984326 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbtks\" (UniqueName: \"kubernetes.io/projected/4ceb5b4c-88af-4498-99f5-f515f0db70a9-kube-api-access-rbtks\") pod \"maas-controller-58f59846df-k6ljr\" (UID: \"4ceb5b4c-88af-4498-99f5-f515f0db70a9\") " pod="opendatahub/maas-controller-58f59846df-k6ljr" Apr 20 19:19:29.077522 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.077450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svmbs\" (UniqueName: \"kubernetes.io/projected/664c0a37-161c-4f07-accb-39073fd6972e-kube-api-access-svmbs\") pod \"maas-controller-6876fc7db-58nvq\" (UID: \"664c0a37-161c-4f07-accb-39073fd6972e\") " pod="opendatahub/maas-controller-6876fc7db-58nvq" Apr 20 19:19:29.085497 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.085454 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svmbs\" (UniqueName: \"kubernetes.io/projected/664c0a37-161c-4f07-accb-39073fd6972e-kube-api-access-svmbs\") pod \"maas-controller-6876fc7db-58nvq\" (UID: \"664c0a37-161c-4f07-accb-39073fd6972e\") " pod="opendatahub/maas-controller-6876fc7db-58nvq" Apr 20 19:19:29.202271 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.202241 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6876fc7db-58nvq" Apr 20 19:19:29.221562 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.221531 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rnl6w"] Apr 20 19:19:29.223932 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:19:29.223903 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded7cb01f_de30_474e_b80f_d042a1b5f4a7.slice/crio-3f4ad400a168baaaa41ffe84ff9f8fb2b08e446786fb25e65c1775ebc1954181 WatchSource:0}: Error finding container 3f4ad400a168baaaa41ffe84ff9f8fb2b08e446786fb25e65c1775ebc1954181: Status 404 returned error can't find the container with id 3f4ad400a168baaaa41ffe84ff9f8fb2b08e446786fb25e65c1775ebc1954181 Apr 20 19:19:29.322677 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.322654 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6876fc7db-58nvq"] Apr 20 19:19:29.325203 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:19:29.325176 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod664c0a37_161c_4f07_accb_39073fd6972e.slice/crio-f005e1addd82623bd046ad01983175afbfcec5647d0c7cc66a0692bf4f45ca52 WatchSource:0}: Error finding container f005e1addd82623bd046ad01983175afbfcec5647d0c7cc66a0692bf4f45ca52: Status 404 returned error can't find the container with id f005e1addd82623bd046ad01983175afbfcec5647d0c7cc66a0692bf4f45ca52 Apr 20 19:19:29.811352 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.811315 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6876fc7db-58nvq" event={"ID":"664c0a37-161c-4f07-accb-39073fd6972e","Type":"ContainerStarted","Data":"f005e1addd82623bd046ad01983175afbfcec5647d0c7cc66a0692bf4f45ca52"} Apr 20 19:19:29.812563 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.812530 2572 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" event={"ID":"ed7cb01f-de30-474e-b80f-d042a1b5f4a7","Type":"ContainerStarted","Data":"3f4ad400a168baaaa41ffe84ff9f8fb2b08e446786fb25e65c1775ebc1954181"} Apr 20 19:19:29.812682 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.812603 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-58f59846df-k6ljr" Apr 20 19:19:29.818024 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.818005 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-58f59846df-k6ljr" Apr 20 19:19:29.986869 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.986827 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbtks\" (UniqueName: \"kubernetes.io/projected/4ceb5b4c-88af-4498-99f5-f515f0db70a9-kube-api-access-rbtks\") pod \"4ceb5b4c-88af-4498-99f5-f515f0db70a9\" (UID: \"4ceb5b4c-88af-4498-99f5-f515f0db70a9\") " Apr 20 19:19:29.989879 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:29.989799 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ceb5b4c-88af-4498-99f5-f515f0db70a9-kube-api-access-rbtks" (OuterVolumeSpecName: "kube-api-access-rbtks") pod "4ceb5b4c-88af-4498-99f5-f515f0db70a9" (UID: "4ceb5b4c-88af-4498-99f5-f515f0db70a9"). InnerVolumeSpecName "kube-api-access-rbtks". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:19:30.087876 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:30.087791 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbtks\" (UniqueName: \"kubernetes.io/projected/4ceb5b4c-88af-4498-99f5-f515f0db70a9-kube-api-access-rbtks\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:19:30.816775 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:30.816730 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-58f59846df-k6ljr" Apr 20 19:19:30.849954 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:30.849923 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-58f59846df-k6ljr"] Apr 20 19:19:30.853420 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:30.853393 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-58f59846df-k6ljr"] Apr 20 19:19:31.829859 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:31.829827 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ceb5b4c-88af-4498-99f5-f515f0db70a9" path="/var/lib/kubelet/pods/4ceb5b4c-88af-4498-99f5-f515f0db70a9/volumes" Apr 20 19:19:32.826237 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:32.826207 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6876fc7db-58nvq" event={"ID":"664c0a37-161c-4f07-accb-39073fd6972e","Type":"ContainerStarted","Data":"3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79"} Apr 20 19:19:32.826435 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:32.826360 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6876fc7db-58nvq" Apr 20 19:19:32.827873 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:32.827832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" event={"ID":"ed7cb01f-de30-474e-b80f-d042a1b5f4a7","Type":"ContainerStarted","Data":"1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36"} Apr 20 19:19:32.827991 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:32.827962 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" Apr 20 19:19:32.846712 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:32.846654 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-controller-6876fc7db-58nvq" podStartSLOduration=1.9844223909999998 podStartE2EDuration="4.846635066s" podCreationTimestamp="2026-04-20 19:19:28 +0000 UTC" firstStartedPulling="2026-04-20 19:19:29.326520015 +0000 UTC m=+678.045822778" lastFinishedPulling="2026-04-20 19:19:32.188732699 +0000 UTC m=+680.908035453" observedRunningTime="2026-04-20 19:19:32.844837041 +0000 UTC m=+681.564139813" watchObservedRunningTime="2026-04-20 19:19:32.846635066 +0000 UTC m=+681.565937859" Apr 20 19:19:32.865672 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:32.865633 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" podStartSLOduration=1.904205146 podStartE2EDuration="4.865621231s" podCreationTimestamp="2026-04-20 19:19:28 +0000 UTC" firstStartedPulling="2026-04-20 19:19:29.224993824 +0000 UTC m=+677.944296573" lastFinishedPulling="2026-04-20 19:19:32.186409907 +0000 UTC m=+680.905712658" observedRunningTime="2026-04-20 19:19:32.863870396 +0000 UTC m=+681.583173179" watchObservedRunningTime="2026-04-20 19:19:32.865621231 +0000 UTC m=+681.584924003" Apr 20 19:19:34.375742 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.375706 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-69876cf954-mqlfg"] Apr 20 19:19:34.377997 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.377982 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:34.380107 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.380082 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 19:19:34.380107 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.380087 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 19:19:34.380303 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.380121 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qbkhb\"" Apr 20 19:19:34.387988 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.387964 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-69876cf954-mqlfg"] Apr 20 19:19:34.526888 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.526851 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6c6\" (UniqueName: \"kubernetes.io/projected/77981269-735e-4b56-b8b2-884528ae4940-kube-api-access-kk6c6\") pod \"maas-api-69876cf954-mqlfg\" (UID: \"77981269-735e-4b56-b8b2-884528ae4940\") " pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:34.527106 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.526940 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/77981269-735e-4b56-b8b2-884528ae4940-maas-api-tls\") pod \"maas-api-69876cf954-mqlfg\" (UID: \"77981269-735e-4b56-b8b2-884528ae4940\") " pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:34.627885 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.627791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6c6\" (UniqueName: \"kubernetes.io/projected/77981269-735e-4b56-b8b2-884528ae4940-kube-api-access-kk6c6\") pod 
\"maas-api-69876cf954-mqlfg\" (UID: \"77981269-735e-4b56-b8b2-884528ae4940\") " pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:34.627885 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.627868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/77981269-735e-4b56-b8b2-884528ae4940-maas-api-tls\") pod \"maas-api-69876cf954-mqlfg\" (UID: \"77981269-735e-4b56-b8b2-884528ae4940\") " pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:34.630320 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.630283 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/77981269-735e-4b56-b8b2-884528ae4940-maas-api-tls\") pod \"maas-api-69876cf954-mqlfg\" (UID: \"77981269-735e-4b56-b8b2-884528ae4940\") " pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:34.637165 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.637134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6c6\" (UniqueName: \"kubernetes.io/projected/77981269-735e-4b56-b8b2-884528ae4940-kube-api-access-kk6c6\") pod \"maas-api-69876cf954-mqlfg\" (UID: \"77981269-735e-4b56-b8b2-884528ae4940\") " pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:34.688282 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.688247 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:34.820166 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.820141 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-69876cf954-mqlfg"] Apr 20 19:19:34.822499 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:19:34.822453 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77981269_735e_4b56_b8b2_884528ae4940.slice/crio-eb8464d1968059716d5d637011cb5b43eecc7116d3c91a49150d023f1c5baf59 WatchSource:0}: Error finding container eb8464d1968059716d5d637011cb5b43eecc7116d3c91a49150d023f1c5baf59: Status 404 returned error can't find the container with id eb8464d1968059716d5d637011cb5b43eecc7116d3c91a49150d023f1c5baf59 Apr 20 19:19:34.834527 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:34.834498 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-69876cf954-mqlfg" event={"ID":"77981269-735e-4b56-b8b2-884528ae4940","Type":"ContainerStarted","Data":"eb8464d1968059716d5d637011cb5b43eecc7116d3c91a49150d023f1c5baf59"} Apr 20 19:19:36.841985 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:36.841952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-69876cf954-mqlfg" event={"ID":"77981269-735e-4b56-b8b2-884528ae4940","Type":"ContainerStarted","Data":"d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1"} Apr 20 19:19:36.842339 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:36.842072 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:36.867463 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:36.867414 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-69876cf954-mqlfg" podStartSLOduration=1.44036045 podStartE2EDuration="2.867399287s" podCreationTimestamp="2026-04-20 19:19:34 +0000 UTC" 
firstStartedPulling="2026-04-20 19:19:34.823988813 +0000 UTC m=+683.543291569" lastFinishedPulling="2026-04-20 19:19:36.251027653 +0000 UTC m=+684.970330406" observedRunningTime="2026-04-20 19:19:36.865785072 +0000 UTC m=+685.585087846" watchObservedRunningTime="2026-04-20 19:19:36.867399287 +0000 UTC m=+685.586702116" Apr 20 19:19:42.851073 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:42.851043 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:19:43.838012 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:43.837981 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" Apr 20 19:19:43.838215 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:43.838033 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6876fc7db-58nvq" Apr 20 19:19:43.909981 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:43.909948 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rnl6w"] Apr 20 19:19:43.910385 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:43.910145 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" podUID="ed7cb01f-de30-474e-b80f-d042a1b5f4a7" containerName="manager" containerID="cri-o://1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36" gracePeriod=10 Apr 20 19:19:44.151176 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.151152 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" Apr 20 19:19:44.216046 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.216009 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5xd4\" (UniqueName: \"kubernetes.io/projected/ed7cb01f-de30-474e-b80f-d042a1b5f4a7-kube-api-access-t5xd4\") pod \"ed7cb01f-de30-474e-b80f-d042a1b5f4a7\" (UID: \"ed7cb01f-de30-474e-b80f-d042a1b5f4a7\") " Apr 20 19:19:44.218140 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.218114 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed7cb01f-de30-474e-b80f-d042a1b5f4a7-kube-api-access-t5xd4" (OuterVolumeSpecName: "kube-api-access-t5xd4") pod "ed7cb01f-de30-474e-b80f-d042a1b5f4a7" (UID: "ed7cb01f-de30-474e-b80f-d042a1b5f4a7"). InnerVolumeSpecName "kube-api-access-t5xd4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:19:44.249571 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.249535 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6448987b54-4qsm8"] Apr 20 19:19:44.249888 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.249874 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed7cb01f-de30-474e-b80f-d042a1b5f4a7" containerName="manager" Apr 20 19:19:44.249935 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.249891 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed7cb01f-de30-474e-b80f-d042a1b5f4a7" containerName="manager" Apr 20 19:19:44.249969 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.249943 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed7cb01f-de30-474e-b80f-d042a1b5f4a7" containerName="manager" Apr 20 19:19:44.252206 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.252190 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6448987b54-4qsm8" Apr 20 19:19:44.260138 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.260111 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6448987b54-4qsm8"] Apr 20 19:19:44.316829 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.316786 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7km9\" (UniqueName: \"kubernetes.io/projected/2fca5714-012b-46cb-b1d1-0ff24b14b8b3-kube-api-access-s7km9\") pod \"maas-controller-6448987b54-4qsm8\" (UID: \"2fca5714-012b-46cb-b1d1-0ff24b14b8b3\") " pod="opendatahub/maas-controller-6448987b54-4qsm8" Apr 20 19:19:44.316979 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.316856 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t5xd4\" (UniqueName: \"kubernetes.io/projected/ed7cb01f-de30-474e-b80f-d042a1b5f4a7-kube-api-access-t5xd4\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:19:44.418107 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.418073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7km9\" (UniqueName: \"kubernetes.io/projected/2fca5714-012b-46cb-b1d1-0ff24b14b8b3-kube-api-access-s7km9\") pod \"maas-controller-6448987b54-4qsm8\" (UID: \"2fca5714-012b-46cb-b1d1-0ff24b14b8b3\") " pod="opendatahub/maas-controller-6448987b54-4qsm8" Apr 20 19:19:44.427674 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.427640 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7km9\" (UniqueName: \"kubernetes.io/projected/2fca5714-012b-46cb-b1d1-0ff24b14b8b3-kube-api-access-s7km9\") pod \"maas-controller-6448987b54-4qsm8\" (UID: \"2fca5714-012b-46cb-b1d1-0ff24b14b8b3\") " pod="opendatahub/maas-controller-6448987b54-4qsm8" Apr 20 19:19:44.564219 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.564181 2572 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6448987b54-4qsm8" Apr 20 19:19:44.683737 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.683708 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6448987b54-4qsm8"] Apr 20 19:19:44.686076 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:19:44.686048 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fca5714_012b_46cb_b1d1_0ff24b14b8b3.slice/crio-28154e19ccdd578ba38421bbd2afa31c56eb3a5c12c7d00bd0c2475d4978747c WatchSource:0}: Error finding container 28154e19ccdd578ba38421bbd2afa31c56eb3a5c12c7d00bd0c2475d4978747c: Status 404 returned error can't find the container with id 28154e19ccdd578ba38421bbd2afa31c56eb3a5c12c7d00bd0c2475d4978747c Apr 20 19:19:44.868327 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.868294 2572 generic.go:358] "Generic (PLEG): container finished" podID="ed7cb01f-de30-474e-b80f-d042a1b5f4a7" containerID="1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36" exitCode=0 Apr 20 19:19:44.868538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.868352 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" Apr 20 19:19:44.868538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.868388 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" event={"ID":"ed7cb01f-de30-474e-b80f-d042a1b5f4a7","Type":"ContainerDied","Data":"1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36"} Apr 20 19:19:44.868538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.868421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-rnl6w" event={"ID":"ed7cb01f-de30-474e-b80f-d042a1b5f4a7","Type":"ContainerDied","Data":"3f4ad400a168baaaa41ffe84ff9f8fb2b08e446786fb25e65c1775ebc1954181"} Apr 20 19:19:44.868538 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.868441 2572 scope.go:117] "RemoveContainer" containerID="1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36" Apr 20 19:19:44.869547 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.869527 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6448987b54-4qsm8" event={"ID":"2fca5714-012b-46cb-b1d1-0ff24b14b8b3","Type":"ContainerStarted","Data":"28154e19ccdd578ba38421bbd2afa31c56eb3a5c12c7d00bd0c2475d4978747c"} Apr 20 19:19:44.876652 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.876617 2572 scope.go:117] "RemoveContainer" containerID="1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36" Apr 20 19:19:44.876883 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:19:44.876863 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36\": container with ID starting with 1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36 not found: ID does not exist" containerID="1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36" Apr 20 19:19:44.876949 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:19:44.876889 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36"} err="failed to get container status \"1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36\": rpc error: code = NotFound desc = could not find container \"1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36\": container with ID starting with 1ce1011d9af3ee8f400b425cd72b6f683bc19dae4e6e694b4c9b97d542d14b36 not found: ID does not exist" Apr 20 19:19:44.890650 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.890627 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rnl6w"] Apr 20 19:19:44.891981 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:44.891961 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-rnl6w"] Apr 20 19:19:45.829020 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:45.828978 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed7cb01f-de30-474e-b80f-d042a1b5f4a7" path="/var/lib/kubelet/pods/ed7cb01f-de30-474e-b80f-d042a1b5f4a7/volumes" Apr 20 19:19:45.874566 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:45.874532 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6448987b54-4qsm8" event={"ID":"2fca5714-012b-46cb-b1d1-0ff24b14b8b3","Type":"ContainerStarted","Data":"648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171"} Apr 20 19:19:45.874736 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:45.874612 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6448987b54-4qsm8" Apr 20 19:19:45.889793 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:45.889742 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6448987b54-4qsm8" podStartSLOduration=1.477296102 
podStartE2EDuration="1.889727616s" podCreationTimestamp="2026-04-20 19:19:44 +0000 UTC" firstStartedPulling="2026-04-20 19:19:44.687282767 +0000 UTC m=+693.406585518" lastFinishedPulling="2026-04-20 19:19:45.099714283 +0000 UTC m=+693.819017032" observedRunningTime="2026-04-20 19:19:45.889268964 +0000 UTC m=+694.608571733" watchObservedRunningTime="2026-04-20 19:19:45.889727616 +0000 UTC m=+694.609030388" Apr 20 19:19:56.884054 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:56.883983 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6448987b54-4qsm8" Apr 20 19:19:56.931243 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:56.931212 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6876fc7db-58nvq"] Apr 20 19:19:56.931433 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:56.931413 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6876fc7db-58nvq" podUID="664c0a37-161c-4f07-accb-39073fd6972e" containerName="manager" containerID="cri-o://3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79" gracePeriod=10 Apr 20 19:19:57.170063 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.170034 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6876fc7db-58nvq" Apr 20 19:19:57.226346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.226318 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svmbs\" (UniqueName: \"kubernetes.io/projected/664c0a37-161c-4f07-accb-39073fd6972e-kube-api-access-svmbs\") pod \"664c0a37-161c-4f07-accb-39073fd6972e\" (UID: \"664c0a37-161c-4f07-accb-39073fd6972e\") " Apr 20 19:19:57.228279 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.228249 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/664c0a37-161c-4f07-accb-39073fd6972e-kube-api-access-svmbs" (OuterVolumeSpecName: "kube-api-access-svmbs") pod "664c0a37-161c-4f07-accb-39073fd6972e" (UID: "664c0a37-161c-4f07-accb-39073fd6972e"). InnerVolumeSpecName "kube-api-access-svmbs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:19:57.326934 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.326887 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-svmbs\" (UniqueName: \"kubernetes.io/projected/664c0a37-161c-4f07-accb-39073fd6972e-kube-api-access-svmbs\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:19:57.919891 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.919834 2572 generic.go:358] "Generic (PLEG): container finished" podID="664c0a37-161c-4f07-accb-39073fd6972e" containerID="3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79" exitCode=0 Apr 20 19:19:57.920346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.919925 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6876fc7db-58nvq" event={"ID":"664c0a37-161c-4f07-accb-39073fd6972e","Type":"ContainerDied","Data":"3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79"} Apr 20 19:19:57.920346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.919951 2572 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="opendatahub/maas-controller-6876fc7db-58nvq" Apr 20 19:19:57.920346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.919974 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6876fc7db-58nvq" event={"ID":"664c0a37-161c-4f07-accb-39073fd6972e","Type":"ContainerDied","Data":"f005e1addd82623bd046ad01983175afbfcec5647d0c7cc66a0692bf4f45ca52"} Apr 20 19:19:57.920346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.919997 2572 scope.go:117] "RemoveContainer" containerID="3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79" Apr 20 19:19:57.932985 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.932693 2572 scope.go:117] "RemoveContainer" containerID="3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79" Apr 20 19:19:57.933070 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:19:57.933019 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79\": container with ID starting with 3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79 not found: ID does not exist" containerID="3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79" Apr 20 19:19:57.933070 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.933044 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79"} err="failed to get container status \"3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79\": rpc error: code = NotFound desc = could not find container \"3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79\": container with ID starting with 3cf3b866b809ddb5ae5789a85707d04e19ab1d57b48f8f52d106b7d24a482d79 not found: ID does not exist" Apr 20 19:19:57.945445 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.945412 2572 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6876fc7db-58nvq"] Apr 20 19:19:57.947911 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:57.947886 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6876fc7db-58nvq"] Apr 20 19:19:59.829765 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:19:59.829722 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="664c0a37-161c-4f07-accb-39073fd6972e" path="/var/lib/kubelet/pods/664c0a37-161c-4f07-accb-39073fd6972e/volumes" Apr 20 19:20:04.771978 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.771947 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2"] Apr 20 19:20:04.772366 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.772273 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="664c0a37-161c-4f07-accb-39073fd6972e" containerName="manager" Apr 20 19:20:04.772366 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.772285 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="664c0a37-161c-4f07-accb-39073fd6972e" containerName="manager" Apr 20 19:20:04.772366 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.772351 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="664c0a37-161c-4f07-accb-39073fd6972e" containerName="manager" Apr 20 19:20:04.821856 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.821811 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2"] Apr 20 19:20:04.822043 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.821964 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.832057 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.824538 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 19:20:04.832057 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.824966 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 19:20:04.832057 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.825463 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 19:20:04.832057 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.825980 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-lf8z7\"" Apr 20 19:20:04.892655 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.892625 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.892799 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.892663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.892799 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.892686 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.892799 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.892745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmb5\" (UniqueName: \"kubernetes.io/projected/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-kube-api-access-bmmb5\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.892903 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.892810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.892903 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.892866 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.993915 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.993879 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.994077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.993926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.994077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.993948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.994077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.993969 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.994077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.994009 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmmb5\" (UniqueName: 
\"kubernetes.io/projected/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-kube-api-access-bmmb5\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.994077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.994064 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.994408 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.994377 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.994503 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.994416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.994503 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.994466 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-model-cache\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.996210 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.996184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:04.996442 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:04.996426 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:05.004683 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:05.004655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmmb5\" (UniqueName: \"kubernetes.io/projected/ec4ed0e4-0cfd-43c3-838a-658200ce0ee1-kube-api-access-bmmb5\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2\" (UID: \"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:05.139853 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:05.139764 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:05.269933 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:05.269907 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2"] Apr 20 19:20:05.271198 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:20:05.271175 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec4ed0e4_0cfd_43c3_838a_658200ce0ee1.slice/crio-36f5328dc033c5b6fd0ba09ddaab87630f041efbc74757fa7a060be101b85de5 WatchSource:0}: Error finding container 36f5328dc033c5b6fd0ba09ddaab87630f041efbc74757fa7a060be101b85de5: Status 404 returned error can't find the container with id 36f5328dc033c5b6fd0ba09ddaab87630f041efbc74757fa7a060be101b85de5 Apr 20 19:20:05.948907 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:05.948871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" event={"ID":"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1","Type":"ContainerStarted","Data":"36f5328dc033c5b6fd0ba09ddaab87630f041efbc74757fa7a060be101b85de5"} Apr 20 19:20:12.974951 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:12.974914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" event={"ID":"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1","Type":"ContainerStarted","Data":"8a5059ce54092255af9d036c94b873f31861056498b2bfecafbd751476626441"} Apr 20 19:20:16.448153 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.448120 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq"] Apr 20 19:20:16.451681 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.451660 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.453521 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.453502 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 19:20:16.463495 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.463454 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq"] Apr 20 19:20:16.605429 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.605389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.605660 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.605441 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.605660 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.605518 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a01b803-2b21-4324-8b86-e76b2cb58875-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.605660 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.605569 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.605660 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.605609 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.605881 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.605661 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72j6\" (UniqueName: \"kubernetes.io/projected/6a01b803-2b21-4324-8b86-e76b2cb58875-kube-api-access-h72j6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.707000 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.706901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.707000 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.706989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a01b803-2b21-4324-8b86-e76b2cb58875-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: 
\"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.707234 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.707049 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.707234 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.707094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.707234 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.707160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h72j6\" (UniqueName: \"kubernetes.io/projected/6a01b803-2b21-4324-8b86-e76b2cb58875-kube-api-access-h72j6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.707234 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.707198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.707635 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.707610 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.707850 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.707832 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.708752 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.708727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.710695 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.710649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6a01b803-2b21-4324-8b86-e76b2cb58875-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.711030 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.711007 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a01b803-2b21-4324-8b86-e76b2cb58875-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 
19:20:16.716287 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.716254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72j6\" (UniqueName: \"kubernetes.io/projected/6a01b803-2b21-4324-8b86-e76b2cb58875-kube-api-access-h72j6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq\" (UID: \"6a01b803-2b21-4324-8b86-e76b2cb58875\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.768641 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.768603 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:16.913518 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.912004 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq"] Apr 20 19:20:16.917695 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:20:16.917668 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a01b803_2b21_4324_8b86_e76b2cb58875.slice/crio-5fd9dd5f1a40ba1b4ee63e301e3f490f4aa47b91ece813c86b45085c396d331a WatchSource:0}: Error finding container 5fd9dd5f1a40ba1b4ee63e301e3f490f4aa47b91ece813c86b45085c396d331a: Status 404 returned error can't find the container with id 5fd9dd5f1a40ba1b4ee63e301e3f490f4aa47b91ece813c86b45085c396d331a Apr 20 19:20:16.990899 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.990859 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" event={"ID":"6a01b803-2b21-4324-8b86-e76b2cb58875","Type":"ContainerStarted","Data":"cd4c698382c87c8ce68226e075afae9084a16c90bc05d65c0b62838c56d6496e"} Apr 20 19:20:16.991017 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:16.990910 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" 
event={"ID":"6a01b803-2b21-4324-8b86-e76b2cb58875","Type":"ContainerStarted","Data":"5fd9dd5f1a40ba1b4ee63e301e3f490f4aa47b91ece813c86b45085c396d331a"} Apr 20 19:20:18.998704 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:18.998662 2572 generic.go:358] "Generic (PLEG): container finished" podID="ec4ed0e4-0cfd-43c3-838a-658200ce0ee1" containerID="8a5059ce54092255af9d036c94b873f31861056498b2bfecafbd751476626441" exitCode=0 Apr 20 19:20:18.999192 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:18.998735 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" event={"ID":"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1","Type":"ContainerDied","Data":"8a5059ce54092255af9d036c94b873f31861056498b2bfecafbd751476626441"} Apr 20 19:20:21.009366 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:21.008521 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" event={"ID":"ec4ed0e4-0cfd-43c3-838a-658200ce0ee1","Type":"ContainerStarted","Data":"db889ac81a2ca2fb2ff3343b5853ec10fbf8b34806aad4d7bd5b39e66a4fd743"} Apr 20 19:20:21.009366 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:21.009317 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:21.031042 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:21.030981 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" podStartSLOduration=1.924872651 podStartE2EDuration="17.030961343s" podCreationTimestamp="2026-04-20 19:20:04 +0000 UTC" firstStartedPulling="2026-04-20 19:20:05.272959376 +0000 UTC m=+713.992262126" lastFinishedPulling="2026-04-20 19:20:20.379048065 +0000 UTC m=+729.098350818" observedRunningTime="2026-04-20 19:20:21.028839509 +0000 UTC m=+729.748142306" watchObservedRunningTime="2026-04-20 
19:20:21.030961343 +0000 UTC m=+729.750264116" Apr 20 19:20:23.016503 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:23.016452 2572 generic.go:358] "Generic (PLEG): container finished" podID="6a01b803-2b21-4324-8b86-e76b2cb58875" containerID="cd4c698382c87c8ce68226e075afae9084a16c90bc05d65c0b62838c56d6496e" exitCode=0 Apr 20 19:20:23.016898 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:23.016524 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" event={"ID":"6a01b803-2b21-4324-8b86-e76b2cb58875","Type":"ContainerDied","Data":"cd4c698382c87c8ce68226e075afae9084a16c90bc05d65c0b62838c56d6496e"} Apr 20 19:20:24.021808 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:24.021774 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" event={"ID":"6a01b803-2b21-4324-8b86-e76b2cb58875","Type":"ContainerStarted","Data":"d97f2a4a5cf3e482cf5f333b53afa7f6edcc532f34382563689254e27388f9ac"} Apr 20 19:20:24.022205 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:24.021989 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:24.039541 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:24.039499 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" podStartSLOduration=7.796782718 podStartE2EDuration="8.039484798s" podCreationTimestamp="2026-04-20 19:20:16 +0000 UTC" firstStartedPulling="2026-04-20 19:20:23.017115824 +0000 UTC m=+731.736418574" lastFinishedPulling="2026-04-20 19:20:23.25981789 +0000 UTC m=+731.979120654" observedRunningTime="2026-04-20 19:20:24.037803369 +0000 UTC m=+732.757106140" watchObservedRunningTime="2026-04-20 19:20:24.039484798 +0000 UTC m=+732.758787571" Apr 20 19:20:30.162982 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:30.162939 2572 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["opendatahub/maas-api-69876cf954-mqlfg"] Apr 20 19:20:30.163511 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:30.163268 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-69876cf954-mqlfg" podUID="77981269-735e-4b56-b8b2-884528ae4940" containerName="maas-api" containerID="cri-o://d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1" gracePeriod=30 Apr 20 19:20:30.403150 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:30.403127 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:20:30.537055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:30.536964 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk6c6\" (UniqueName: \"kubernetes.io/projected/77981269-735e-4b56-b8b2-884528ae4940-kube-api-access-kk6c6\") pod \"77981269-735e-4b56-b8b2-884528ae4940\" (UID: \"77981269-735e-4b56-b8b2-884528ae4940\") " Apr 20 19:20:30.537055 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:30.537018 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/77981269-735e-4b56-b8b2-884528ae4940-maas-api-tls\") pod \"77981269-735e-4b56-b8b2-884528ae4940\" (UID: \"77981269-735e-4b56-b8b2-884528ae4940\") " Apr 20 19:20:30.539140 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:30.539111 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77981269-735e-4b56-b8b2-884528ae4940-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "77981269-735e-4b56-b8b2-884528ae4940" (UID: "77981269-735e-4b56-b8b2-884528ae4940"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:20:30.539267 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:30.539164 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77981269-735e-4b56-b8b2-884528ae4940-kube-api-access-kk6c6" (OuterVolumeSpecName: "kube-api-access-kk6c6") pod "77981269-735e-4b56-b8b2-884528ae4940" (UID: "77981269-735e-4b56-b8b2-884528ae4940"). InnerVolumeSpecName "kube-api-access-kk6c6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:20:30.638531 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:30.638497 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kk6c6\" (UniqueName: \"kubernetes.io/projected/77981269-735e-4b56-b8b2-884528ae4940-kube-api-access-kk6c6\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:20:30.638531 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:30.638528 2572 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/77981269-735e-4b56-b8b2-884528ae4940-maas-api-tls\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:20:31.048211 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.048180 2572 generic.go:358] "Generic (PLEG): container finished" podID="77981269-735e-4b56-b8b2-884528ae4940" containerID="d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1" exitCode=0 Apr 20 19:20:31.048417 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.048264 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-69876cf954-mqlfg" Apr 20 19:20:31.048417 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.048272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-69876cf954-mqlfg" event={"ID":"77981269-735e-4b56-b8b2-884528ae4940","Type":"ContainerDied","Data":"d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1"} Apr 20 19:20:31.048417 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.048319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-69876cf954-mqlfg" event={"ID":"77981269-735e-4b56-b8b2-884528ae4940","Type":"ContainerDied","Data":"eb8464d1968059716d5d637011cb5b43eecc7116d3c91a49150d023f1c5baf59"} Apr 20 19:20:31.048417 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.048340 2572 scope.go:117] "RemoveContainer" containerID="d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1" Apr 20 19:20:31.056601 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.056581 2572 scope.go:117] "RemoveContainer" containerID="d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1" Apr 20 19:20:31.056850 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:20:31.056833 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1\": container with ID starting with d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1 not found: ID does not exist" containerID="d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1" Apr 20 19:20:31.056898 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.056856 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1"} err="failed to get container status \"d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1\": rpc error: code = NotFound desc = could not 
find container \"d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1\": container with ID starting with d84fc53fc30def46e7fd5449b9a206509e906c965989d9743554d1f02f02f0e1 not found: ID does not exist" Apr 20 19:20:31.067036 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.067011 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-69876cf954-mqlfg"] Apr 20 19:20:31.073222 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.073203 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-69876cf954-mqlfg"] Apr 20 19:20:31.829845 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:31.829814 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77981269-735e-4b56-b8b2-884528ae4940" path="/var/lib/kubelet/pods/77981269-735e-4b56-b8b2-884528ae4940/volumes" Apr 20 19:20:32.025723 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:32.025696 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2" Apr 20 19:20:35.038272 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:35.038238 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq" Apr 20 19:20:40.049340 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.049300 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh"] Apr 20 19:20:40.049949 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.049794 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77981269-735e-4b56-b8b2-884528ae4940" containerName="maas-api" Apr 20 19:20:40.049949 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.049814 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="77981269-735e-4b56-b8b2-884528ae4940" containerName="maas-api" Apr 20 19:20:40.049949 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.049915 2572 
memory_manager.go:356] "RemoveStaleState removing state" podUID="77981269-735e-4b56-b8b2-884528ae4940" containerName="maas-api" Apr 20 19:20:40.054110 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.054088 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.056109 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.056086 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 19:20:40.061320 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.061263 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh"] Apr 20 19:20:40.110283 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.110247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.110516 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.110298 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.110516 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.110410 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99rnx\" (UniqueName: \"kubernetes.io/projected/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-kube-api-access-99rnx\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: 
\"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.110516 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.110457 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.110516 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.110511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.110747 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.110544 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.211878 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.211843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99rnx\" (UniqueName: \"kubernetes.io/projected/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-kube-api-access-99rnx\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.212061 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.211992 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.212061 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.212039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.212185 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.212075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.212185 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.212140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.212291 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.212192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") 
" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.212718 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.212690 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.213038 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.213004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.213169 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.213064 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.215452 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.215362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.215637 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.215615 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-tls-certs\") 
pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.218934 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.218907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99rnx\" (UniqueName: \"kubernetes.io/projected/3870e9c5-9b15-4a36-96ea-563e5f8d27dc-kube-api-access-99rnx\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh\" (UID: \"3870e9c5-9b15-4a36-96ea-563e5f8d27dc\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.364705 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.364617 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:40.491565 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:40.491539 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh"] Apr 20 19:20:40.493823 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:20:40.493792 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3870e9c5_9b15_4a36_96ea_563e5f8d27dc.slice/crio-a9e64477c894dcaa19734de9cc15df7086f945f6eb08e4266a5c3cd8483f18c9 WatchSource:0}: Error finding container a9e64477c894dcaa19734de9cc15df7086f945f6eb08e4266a5c3cd8483f18c9: Status 404 returned error can't find the container with id a9e64477c894dcaa19734de9cc15df7086f945f6eb08e4266a5c3cd8483f18c9 Apr 20 19:20:41.092653 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:41.092616 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" event={"ID":"3870e9c5-9b15-4a36-96ea-563e5f8d27dc","Type":"ContainerStarted","Data":"38993cfcfb71d6c5567155650a3b20e15c059f89068c123dcc2ecff3bdc9ebdc"} Apr 20 19:20:41.092653 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:41.092657 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" event={"ID":"3870e9c5-9b15-4a36-96ea-563e5f8d27dc","Type":"ContainerStarted","Data":"a9e64477c894dcaa19734de9cc15df7086f945f6eb08e4266a5c3cd8483f18c9"} Apr 20 19:20:47.115503 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:47.115449 2572 generic.go:358] "Generic (PLEG): container finished" podID="3870e9c5-9b15-4a36-96ea-563e5f8d27dc" containerID="38993cfcfb71d6c5567155650a3b20e15c059f89068c123dcc2ecff3bdc9ebdc" exitCode=0 Apr 20 19:20:47.115967 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:47.115525 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" event={"ID":"3870e9c5-9b15-4a36-96ea-563e5f8d27dc","Type":"ContainerDied","Data":"38993cfcfb71d6c5567155650a3b20e15c059f89068c123dcc2ecff3bdc9ebdc"} Apr 20 19:20:48.121407 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:48.121317 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" event={"ID":"3870e9c5-9b15-4a36-96ea-563e5f8d27dc","Type":"ContainerStarted","Data":"534e9dd12407cc029cb5f2ffacec1d42e5fb9718ef0470ade4a94ffabd456fd8"} Apr 20 19:20:48.121849 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:48.121631 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:20:48.138124 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:48.138069 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" podStartSLOduration=7.897102268 podStartE2EDuration="8.13805378s" podCreationTimestamp="2026-04-20 19:20:40 +0000 UTC" firstStartedPulling="2026-04-20 19:20:47.116175681 +0000 UTC m=+755.835478444" lastFinishedPulling="2026-04-20 19:20:47.3571272 +0000 UTC m=+756.076429956" observedRunningTime="2026-04-20 19:20:48.137761281 +0000 UTC m=+756.857064055" 
watchObservedRunningTime="2026-04-20 19:20:48.13805378 +0000 UTC m=+756.857356551" Apr 20 19:20:59.138024 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:20:59.137995 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh" Apr 20 19:21:08.652967 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.652929 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc"] Apr 20 19:21:08.655201 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.655184 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.657262 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.657240 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 19:21:08.665558 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.665532 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc"] Apr 20 19:21:08.755657 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.755618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.755825 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.755671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-home\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.755825 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.755717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.755825 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.755733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.755950 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.755826 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlthd\" (UniqueName: \"kubernetes.io/projected/dd1631cf-d803-4b9a-a454-35d4f710604d-kube-api-access-hlthd\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.755950 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.755887 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1631cf-d803-4b9a-a454-35d4f710604d-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: 
\"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.856594 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.856559 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1631cf-d803-4b9a-a454-35d4f710604d-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.856776 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.856610 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.856776 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.856646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.856776 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.856672 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.856776 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:21:08.856687 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.856776 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.856722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlthd\" (UniqueName: \"kubernetes.io/projected/dd1631cf-d803-4b9a-a454-35d4f710604d-kube-api-access-hlthd\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.857139 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.857107 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.857273 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.857117 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.857273 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.857151 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.858875 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.858855 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd1631cf-d803-4b9a-a454-35d4f710604d-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.859060 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.859045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd1631cf-d803-4b9a-a454-35d4f710604d-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.864008 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.863976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlthd\" (UniqueName: \"kubernetes.io/projected/dd1631cf-d803-4b9a-a454-35d4f710604d-kube-api-access-hlthd\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-kslsc\" (UID: \"dd1631cf-d803-4b9a-a454-35d4f710604d\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:08.964372 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:08.964294 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:09.096755 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:09.096727 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc"] Apr 20 19:21:09.098826 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:21:09.098788 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd1631cf_d803_4b9a_a454_35d4f710604d.slice/crio-1a11daf059b761c5d4ed5d21c284c92934277d9ab9a4b92d4c8e21ca36d81cd0 WatchSource:0}: Error finding container 1a11daf059b761c5d4ed5d21c284c92934277d9ab9a4b92d4c8e21ca36d81cd0: Status 404 returned error can't find the container with id 1a11daf059b761c5d4ed5d21c284c92934277d9ab9a4b92d4c8e21ca36d81cd0 Apr 20 19:21:09.100482 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:09.100457 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:21:09.193762 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:09.193726 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" event={"ID":"dd1631cf-d803-4b9a-a454-35d4f710604d","Type":"ContainerStarted","Data":"b0334cb6818d3801f4a7b4d0f51f94e7084912ed6ee58656c31c4d976cc8f973"} Apr 20 19:21:09.193873 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:09.193768 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" event={"ID":"dd1631cf-d803-4b9a-a454-35d4f710604d","Type":"ContainerStarted","Data":"1a11daf059b761c5d4ed5d21c284c92934277d9ab9a4b92d4c8e21ca36d81cd0"} Apr 20 19:21:18.229673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:18.229638 2572 generic.go:358] "Generic (PLEG): container finished" podID="dd1631cf-d803-4b9a-a454-35d4f710604d" 
containerID="b0334cb6818d3801f4a7b4d0f51f94e7084912ed6ee58656c31c4d976cc8f973" exitCode=0 Apr 20 19:21:18.230124 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:18.229697 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" event={"ID":"dd1631cf-d803-4b9a-a454-35d4f710604d","Type":"ContainerDied","Data":"b0334cb6818d3801f4a7b4d0f51f94e7084912ed6ee58656c31c4d976cc8f973"} Apr 20 19:21:19.234266 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:19.234230 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" event={"ID":"dd1631cf-d803-4b9a-a454-35d4f710604d","Type":"ContainerStarted","Data":"59aac2b40c5e0d3dfb6064a4126be4ac5566d08da1092a555b3637751ccc8a1a"} Apr 20 19:21:19.234677 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:19.234427 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:21:19.253284 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:19.253236 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" podStartSLOduration=11.047758015 podStartE2EDuration="11.25322105s" podCreationTimestamp="2026-04-20 19:21:08 +0000 UTC" firstStartedPulling="2026-04-20 19:21:18.230321533 +0000 UTC m=+786.949624283" lastFinishedPulling="2026-04-20 19:21:18.435784565 +0000 UTC m=+787.155087318" observedRunningTime="2026-04-20 19:21:19.250960953 +0000 UTC m=+787.970263725" watchObservedRunningTime="2026-04-20 19:21:19.25322105 +0000 UTC m=+787.972523822" Apr 20 19:21:30.254503 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:21:30.254409 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-kslsc" Apr 20 19:23:02.319898 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.319816 
2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6448987b54-4qsm8"] Apr 20 19:23:02.320506 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.320117 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6448987b54-4qsm8" podUID="2fca5714-012b-46cb-b1d1-0ff24b14b8b3" containerName="manager" containerID="cri-o://648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171" gracePeriod=10 Apr 20 19:23:02.568717 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.568694 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6448987b54-4qsm8" Apr 20 19:23:02.587399 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.587326 2572 generic.go:358] "Generic (PLEG): container finished" podID="2fca5714-012b-46cb-b1d1-0ff24b14b8b3" containerID="648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171" exitCode=0 Apr 20 19:23:02.587399 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.587381 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6448987b54-4qsm8" event={"ID":"2fca5714-012b-46cb-b1d1-0ff24b14b8b3","Type":"ContainerDied","Data":"648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171"} Apr 20 19:23:02.587581 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.587409 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6448987b54-4qsm8" event={"ID":"2fca5714-012b-46cb-b1d1-0ff24b14b8b3","Type":"ContainerDied","Data":"28154e19ccdd578ba38421bbd2afa31c56eb3a5c12c7d00bd0c2475d4978747c"} Apr 20 19:23:02.587581 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.587428 2572 scope.go:117] "RemoveContainer" containerID="648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171" Apr 20 19:23:02.587670 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.587594 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6448987b54-4qsm8" Apr 20 19:23:02.596310 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.596292 2572 scope.go:117] "RemoveContainer" containerID="648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171" Apr 20 19:23:02.596595 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:23:02.596576 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171\": container with ID starting with 648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171 not found: ID does not exist" containerID="648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171" Apr 20 19:23:02.596667 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.596608 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171"} err="failed to get container status \"648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171\": rpc error: code = NotFound desc = could not find container \"648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171\": container with ID starting with 648e9e9635a39dc71956244c6b4548d653f459a2a995e614e23daaf7985a2171 not found: ID does not exist" Apr 20 19:23:02.753103 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.753064 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7km9\" (UniqueName: \"kubernetes.io/projected/2fca5714-012b-46cb-b1d1-0ff24b14b8b3-kube-api-access-s7km9\") pod \"2fca5714-012b-46cb-b1d1-0ff24b14b8b3\" (UID: \"2fca5714-012b-46cb-b1d1-0ff24b14b8b3\") " Apr 20 19:23:02.755093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.755071 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fca5714-012b-46cb-b1d1-0ff24b14b8b3-kube-api-access-s7km9" 
(OuterVolumeSpecName: "kube-api-access-s7km9") pod "2fca5714-012b-46cb-b1d1-0ff24b14b8b3" (UID: "2fca5714-012b-46cb-b1d1-0ff24b14b8b3"). InnerVolumeSpecName "kube-api-access-s7km9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:23:02.854329 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.854252 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s7km9\" (UniqueName: \"kubernetes.io/projected/2fca5714-012b-46cb-b1d1-0ff24b14b8b3-kube-api-access-s7km9\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:23:02.909122 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.909092 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6448987b54-4qsm8"] Apr 20 19:23:02.914396 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:02.914368 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6448987b54-4qsm8"] Apr 20 19:23:03.634679 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.634591 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6448987b54-t6t94"] Apr 20 19:23:03.635204 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.635073 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fca5714-012b-46cb-b1d1-0ff24b14b8b3" containerName="manager" Apr 20 19:23:03.635204 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.635092 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fca5714-012b-46cb-b1d1-0ff24b14b8b3" containerName="manager" Apr 20 19:23:03.635204 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.635202 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fca5714-012b-46cb-b1d1-0ff24b14b8b3" containerName="manager" Apr 20 19:23:03.639577 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.639556 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6448987b54-t6t94" Apr 20 19:23:03.641609 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.641591 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-x4p8t\"" Apr 20 19:23:03.645889 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.645861 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6448987b54-t6t94"] Apr 20 19:23:03.763880 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.763843 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsm5q\" (UniqueName: \"kubernetes.io/projected/fe74a19d-0983-487f-ad18-62767b812076-kube-api-access-lsm5q\") pod \"maas-controller-6448987b54-t6t94\" (UID: \"fe74a19d-0983-487f-ad18-62767b812076\") " pod="opendatahub/maas-controller-6448987b54-t6t94" Apr 20 19:23:03.829768 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.829735 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fca5714-012b-46cb-b1d1-0ff24b14b8b3" path="/var/lib/kubelet/pods/2fca5714-012b-46cb-b1d1-0ff24b14b8b3/volumes" Apr 20 19:23:03.864270 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.864241 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsm5q\" (UniqueName: \"kubernetes.io/projected/fe74a19d-0983-487f-ad18-62767b812076-kube-api-access-lsm5q\") pod \"maas-controller-6448987b54-t6t94\" (UID: \"fe74a19d-0983-487f-ad18-62767b812076\") " pod="opendatahub/maas-controller-6448987b54-t6t94" Apr 20 19:23:03.872293 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.872258 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsm5q\" (UniqueName: \"kubernetes.io/projected/fe74a19d-0983-487f-ad18-62767b812076-kube-api-access-lsm5q\") pod \"maas-controller-6448987b54-t6t94\" (UID: \"fe74a19d-0983-487f-ad18-62767b812076\") " 
pod="opendatahub/maas-controller-6448987b54-t6t94" Apr 20 19:23:03.951241 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:03.951208 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6448987b54-t6t94" Apr 20 19:23:04.067507 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:04.067465 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6448987b54-t6t94"] Apr 20 19:23:04.070083 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:23:04.070052 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe74a19d_0983_487f_ad18_62767b812076.slice/crio-19a6c094b608e8beee4989d6ee203025993d1e4ea198eefff8dd0f819c0b4183 WatchSource:0}: Error finding container 19a6c094b608e8beee4989d6ee203025993d1e4ea198eefff8dd0f819c0b4183: Status 404 returned error can't find the container with id 19a6c094b608e8beee4989d6ee203025993d1e4ea198eefff8dd0f819c0b4183 Apr 20 19:23:04.596093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:04.596059 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6448987b54-t6t94" event={"ID":"fe74a19d-0983-487f-ad18-62767b812076","Type":"ContainerStarted","Data":"9b81a417ea3ccbfb49d729452db28626c1c4283ac41a708cd972556d8878ef6d"} Apr 20 19:23:04.596093 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:04.596099 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6448987b54-t6t94" event={"ID":"fe74a19d-0983-487f-ad18-62767b812076","Type":"ContainerStarted","Data":"19a6c094b608e8beee4989d6ee203025993d1e4ea198eefff8dd0f819c0b4183"} Apr 20 19:23:04.596339 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:04.596232 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6448987b54-t6t94" Apr 20 19:23:04.614106 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:04.614052 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6448987b54-t6t94" podStartSLOduration=1.197585205 podStartE2EDuration="1.614039576s" podCreationTimestamp="2026-04-20 19:23:03 +0000 UTC" firstStartedPulling="2026-04-20 19:23:04.071231753 +0000 UTC m=+892.790534503" lastFinishedPulling="2026-04-20 19:23:04.487686121 +0000 UTC m=+893.206988874" observedRunningTime="2026-04-20 19:23:04.612160651 +0000 UTC m=+893.331463423" watchObservedRunningTime="2026-04-20 19:23:04.614039576 +0000 UTC m=+893.333342348" Apr 20 19:23:15.606686 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:23:15.606653 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6448987b54-t6t94" Apr 20 19:33:13.263371 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.263337 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"] Apr 20 19:33:13.266014 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.263586 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m" podUID="ca0c216b-59ae-4b35-bf71-e65828a1a0ac" containerName="manager" containerID="cri-o://cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da" gracePeriod=10 Apr 20 19:33:13.608876 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.608849 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m" Apr 20 19:33:13.636519 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.636459 2572 generic.go:358] "Generic (PLEG): container finished" podID="ca0c216b-59ae-4b35-bf71-e65828a1a0ac" containerID="cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da" exitCode=0 Apr 20 19:33:13.636647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.636563 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m" Apr 20 19:33:13.636647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.636578 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m" event={"ID":"ca0c216b-59ae-4b35-bf71-e65828a1a0ac","Type":"ContainerDied","Data":"cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da"} Apr 20 19:33:13.636647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.636615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m" event={"ID":"ca0c216b-59ae-4b35-bf71-e65828a1a0ac","Type":"ContainerDied","Data":"2f54ea21b4f08526b74574229104daf40030067945595291bacb68e4e9baf802"} Apr 20 19:33:13.636647 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.636631 2572 scope.go:117] "RemoveContainer" containerID="cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da" Apr 20 19:33:13.645432 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.645408 2572 scope.go:117] "RemoveContainer" containerID="cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da" Apr 20 19:33:13.645781 ip-10-0-136-5 kubenswrapper[2572]: E0420 19:33:13.645744 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da\": 
container with ID starting with cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da not found: ID does not exist" containerID="cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da" Apr 20 19:33:13.645876 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.645791 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da"} err="failed to get container status \"cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da\": rpc error: code = NotFound desc = could not find container \"cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da\": container with ID starting with cf0fc17358d9af83c6c832995af2ce7c27958722fab8f4e001796ff43ed963da not found: ID does not exist" Apr 20 19:33:13.646993 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.646972 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-extensions-socket-volume\") pod \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\" (UID: \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\") " Apr 20 19:33:13.647099 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.647011 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c95lh\" (UniqueName: \"kubernetes.io/projected/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-kube-api-access-c95lh\") pod \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\" (UID: \"ca0c216b-59ae-4b35-bf71-e65828a1a0ac\") " Apr 20 19:33:13.647368 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.647343 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "ca0c216b-59ae-4b35-bf71-e65828a1a0ac" (UID: "ca0c216b-59ae-4b35-bf71-e65828a1a0ac"). 
InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:33:13.649114 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.649088 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-kube-api-access-c95lh" (OuterVolumeSpecName: "kube-api-access-c95lh") pod "ca0c216b-59ae-4b35-bf71-e65828a1a0ac" (UID: "ca0c216b-59ae-4b35-bf71-e65828a1a0ac"). InnerVolumeSpecName "kube-api-access-c95lh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:33:13.747753 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.747721 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-extensions-socket-volume\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:33:13.747753 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.747747 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c95lh\" (UniqueName: \"kubernetes.io/projected/ca0c216b-59ae-4b35-bf71-e65828a1a0ac-kube-api-access-c95lh\") on node \"ip-10-0-136-5.ec2.internal\" DevicePath \"\"" Apr 20 19:33:13.952561 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.952527 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"] Apr 20 19:33:13.956805 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:13.956781 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m"] Apr 20 19:33:14.570662 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:14.570618 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-7c22m" podUID="ca0c216b-59ae-4b35-bf71-e65828a1a0ac" containerName="manager" probeResult="failure" output="Get 
\"http://10.133.0.24:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 20 19:33:15.829292 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:33:15.829258 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca0c216b-59ae-4b35-bf71-e65828a1a0ac" path="/var/lib/kubelet/pods/ca0c216b-59ae-4b35-bf71-e65828a1a0ac/volumes" Apr 20 19:34:19.400438 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.400406 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr"] Apr 20 19:34:19.400928 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.400740 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca0c216b-59ae-4b35-bf71-e65828a1a0ac" containerName="manager" Apr 20 19:34:19.400928 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.400751 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca0c216b-59ae-4b35-bf71-e65828a1a0ac" containerName="manager" Apr 20 19:34:19.400928 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.400814 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca0c216b-59ae-4b35-bf71-e65828a1a0ac" containerName="manager" Apr 20 19:34:19.404091 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.404075 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:34:19.407346 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.407328 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:34:19.407505 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.407464 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:34:19.407605 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.407588 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-npjhc\"" Apr 20 19:34:19.416893 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.416873 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr"] Apr 20 19:34:19.527203 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.527159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skw7n\" (UniqueName: \"kubernetes.io/projected/eaf85fb7-acba-4720-adb0-27233fe067a4-kube-api-access-skw7n\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mmxsr\" (UID: \"eaf85fb7-acba-4720-adb0-27233fe067a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:34:19.527382 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.527221 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaf85fb7-acba-4720-adb0-27233fe067a4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mmxsr\" (UID: \"eaf85fb7-acba-4720-adb0-27233fe067a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:34:19.628269 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:34:19.628239 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaf85fb7-acba-4720-adb0-27233fe067a4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mmxsr\" (UID: \"eaf85fb7-acba-4720-adb0-27233fe067a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:34:19.628416 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.628307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-skw7n\" (UniqueName: \"kubernetes.io/projected/eaf85fb7-acba-4720-adb0-27233fe067a4-kube-api-access-skw7n\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mmxsr\" (UID: \"eaf85fb7-acba-4720-adb0-27233fe067a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:34:19.628637 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.628619 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/eaf85fb7-acba-4720-adb0-27233fe067a4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mmxsr\" (UID: \"eaf85fb7-acba-4720-adb0-27233fe067a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:34:19.637036 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.637012 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-skw7n\" (UniqueName: \"kubernetes.io/projected/eaf85fb7-acba-4720-adb0-27233fe067a4-kube-api-access-skw7n\") pod \"kuadrant-operator-controller-manager-55c7f4c975-mmxsr\" (UID: \"eaf85fb7-acba-4720-adb0-27233fe067a4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:34:19.714384 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.714290 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:34:19.838258 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.838238 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr"] Apr 20 19:34:19.840610 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:34:19.840581 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaf85fb7_acba_4720_adb0_27233fe067a4.slice/crio-aa9bb559a9a63ff88892a650c58bab755fe8b6a84ce36b47aa1042409dba2300 WatchSource:0}: Error finding container aa9bb559a9a63ff88892a650c58bab755fe8b6a84ce36b47aa1042409dba2300: Status 404 returned error can't find the container with id aa9bb559a9a63ff88892a650c58bab755fe8b6a84ce36b47aa1042409dba2300 Apr 20 19:34:19.842754 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.842734 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:34:19.855998 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:19.855972 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" event={"ID":"eaf85fb7-acba-4720-adb0-27233fe067a4","Type":"ContainerStarted","Data":"aa9bb559a9a63ff88892a650c58bab755fe8b6a84ce36b47aa1042409dba2300"} Apr 20 19:34:20.861036 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:20.861001 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" event={"ID":"eaf85fb7-acba-4720-adb0-27233fe067a4","Type":"ContainerStarted","Data":"297852e149e8b89fae2f7a0913827fa4bcfb59ebbefed26578fa31a582292926"} Apr 20 19:34:20.861404 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:20.861156 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:34:20.878318 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:20.878273 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" podStartSLOduration=1.8782582030000001 podStartE2EDuration="1.878258203s" podCreationTimestamp="2026-04-20 19:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:34:20.876607117 +0000 UTC m=+1569.595909888" watchObservedRunningTime="2026-04-20 19:34:20.878258203 +0000 UTC m=+1569.597560975" Apr 20 19:34:31.866571 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:34:31.866541 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-mmxsr" Apr 20 19:44:10.455713 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:10.455680 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-njx8s_5e6fec0a-e407-4959-8eac-96add8aa5367/manager/0.log" Apr 20 19:44:10.697199 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:10.697165 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-6448987b54-t6t94_fe74a19d-0983-487f-ad18-62767b812076/manager/0.log" Apr 20 19:44:10.819358 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:10.819285 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-hkrrj_549ea97d-23c0-456d-b087-b04bb3694d05/manager/2.log" Apr 20 19:44:10.946295 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:10.946263 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c77764cd6-d25gj_26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba/manager/0.log" Apr 20 19:44:11.297623 ip-10-0-136-5 kubenswrapper[2572]: I0420 
19:44:11.297593 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-kj5qj_b23241e5-4c3e-4302-9dd1-bf63e4497593/postgres/0.log" Apr 20 19:44:13.140692 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:13.140653 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-mmxsr_eaf85fb7-acba-4720-adb0-27233fe067a4/manager/0.log" Apr 20 19:44:13.834401 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:13.834368 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8ptvr_7f2c3012-6e24-4bb5-b0e9-72fb8186d007/discovery/0.log" Apr 20 19:44:13.949993 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:13.949962 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5489467c57-rkknm_8208c3ef-bdb3-4df4-81b8-808e9113792f/kube-auth-proxy/0.log" Apr 20 19:44:14.613993 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:14.613955 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq_6a01b803-2b21-4324-8b86-e76b2cb58875/storage-initializer/0.log" Apr 20 19:44:14.620719 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:14.620696 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-2wglq_6a01b803-2b21-4324-8b86-e76b2cb58875/main/0.log" Apr 20 19:44:14.729012 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:14.728980 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh_3870e9c5-9b15-4a36-96ea-563e5f8d27dc/storage-initializer/0.log" Apr 20 19:44:14.736581 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:14.736556 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-jh2sh_3870e9c5-9b15-4a36-96ea-563e5f8d27dc/main/0.log" Apr 20 19:44:14.960003 ip-10-0-136-5 
kubenswrapper[2572]: I0420 19:44:14.959977 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2_ec4ed0e4-0cfd-43c3-838a-658200ce0ee1/storage-initializer/0.log" Apr 20 19:44:14.968546 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:14.968516 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc5bfs2_ec4ed0e4-0cfd-43c3-838a-658200ce0ee1/main/0.log" Apr 20 19:44:15.193976 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:15.193950 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-kslsc_dd1631cf-d803-4b9a-a454-35d4f710604d/storage-initializer/0.log" Apr 20 19:44:15.201771 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:15.201749 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-kslsc_dd1631cf-d803-4b9a-a454-35d4f710604d/main/0.log" Apr 20 19:44:22.408123 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:22.408092 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5nsbx_7850d8bf-83d1-45ed-9a2d-cccbc11a2db8/global-pull-secret-syncer/0.log" Apr 20 19:44:22.541428 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:22.541399 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pt5md_f9b82167-c5ea-4998-8d5c-93cec402a0fa/konnectivity-agent/0.log" Apr 20 19:44:22.701100 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:22.701024 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-5.ec2.internal_c759d37c693dd42a96a635e33d6e4429/haproxy/0.log" Apr 20 19:44:26.538886 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:26.538849 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-mmxsr_eaf85fb7-acba-4720-adb0-27233fe067a4/manager/0.log" Apr 20 19:44:28.176954 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.176924 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9q78_41dab2ed-afae-48a6-8ea7-794f4f1f5e76/node-exporter/0.log" Apr 20 19:44:28.191412 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.191384 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9q78_41dab2ed-afae-48a6-8ea7-794f4f1f5e76/kube-rbac-proxy/0.log" Apr 20 19:44:28.206266 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.206240 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9q78_41dab2ed-afae-48a6-8ea7-794f4f1f5e76/init-textfile/0.log" Apr 20 19:44:28.433557 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.433530 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da19f46-34a6-4ebb-863a-83d31b4ab964/prometheus/0.log" Apr 20 19:44:28.453164 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.453141 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da19f46-34a6-4ebb-863a-83d31b4ab964/config-reloader/0.log" Apr 20 19:44:28.470821 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.470787 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da19f46-34a6-4ebb-863a-83d31b4ab964/thanos-sidecar/0.log" Apr 20 19:44:28.491070 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.491043 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da19f46-34a6-4ebb-863a-83d31b4ab964/kube-rbac-proxy-web/0.log" Apr 20 19:44:28.518046 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.518024 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da19f46-34a6-4ebb-863a-83d31b4ab964/kube-rbac-proxy/0.log" Apr 20 19:44:28.538821 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.538796 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da19f46-34a6-4ebb-863a-83d31b4ab964/kube-rbac-proxy-thanos/0.log" Apr 20 19:44:28.557427 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:28.557403 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da19f46-34a6-4ebb-863a-83d31b4ab964/init-config-reloader/0.log" Apr 20 19:44:31.297089 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.297050 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5"] Apr 20 19:44:31.300861 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.300838 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.302702 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.302679 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xgkc7\"/\"default-dockercfg-7j246\"" Apr 20 19:44:31.302864 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.302849 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xgkc7\"/\"openshift-service-ca.crt\"" Apr 20 19:44:31.303132 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.303117 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xgkc7\"/\"kube-root-ca.crt\"" Apr 20 19:44:31.307134 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.307071 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5"] Apr 20 19:44:31.396448 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.396417 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-sys\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.396651 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.396507 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-podres\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.396651 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.396534 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-proc\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.396651 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.396590 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-lib-modules\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.396651 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.396629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffllk\" (UniqueName: \"kubernetes.io/projected/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-kube-api-access-ffllk\") pod 
\"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.498034 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.498002 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-lib-modules\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.498197 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.498044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffllk\" (UniqueName: \"kubernetes.io/projected/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-kube-api-access-ffllk\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.498197 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.498081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-sys\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.498197 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.498117 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-podres\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.498197 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.498133 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-proc\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.498197 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.498182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-lib-modules\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.498420 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.498194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-sys\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.498420 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.498247 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-podres\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.498420 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.498298 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-proc\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" Apr 20 19:44:31.505638 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.505610 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffllk\" (UniqueName: \"kubernetes.io/projected/fc7c3a4f-f37f-4c04-adc9-b35e4baf182b-kube-api-access-ffllk\") pod \"perf-node-gather-daemonset-lmbz5\" (UID: \"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5"
Apr 20 19:44:31.611720 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.611629 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5"
Apr 20 19:44:31.734673 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.734656 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5"]
Apr 20 19:44:31.736901 ip-10-0-136-5 kubenswrapper[2572]: W0420 19:44:31.736865 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfc7c3a4f_f37f_4c04_adc9_b35e4baf182b.slice/crio-48be58ce9076c10079ed20ac236147114f96d768db911e21b358efacf210acfb WatchSource:0}: Error finding container 48be58ce9076c10079ed20ac236147114f96d768db911e21b358efacf210acfb: Status 404 returned error can't find the container with id 48be58ce9076c10079ed20ac236147114f96d768db911e21b358efacf210acfb
Apr 20 19:44:31.738452 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.738435 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:44:31.935167 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.935118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" event={"ID":"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b","Type":"ContainerStarted","Data":"8e9e53375909dac376216f17bbd3d40fe3694077ad813f6d437cab7321deec4d"}
Apr 20 19:44:31.935167 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.935160 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" event={"ID":"fc7c3a4f-f37f-4c04-adc9-b35e4baf182b","Type":"ContainerStarted","Data":"48be58ce9076c10079ed20ac236147114f96d768db911e21b358efacf210acfb"}
Apr 20 19:44:31.935373 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.935267 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5"
Apr 20 19:44:31.951653 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:31.951602 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5" podStartSLOduration=0.951584628 podStartE2EDuration="951.584628ms" podCreationTimestamp="2026-04-20 19:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:44:31.950145697 +0000 UTC m=+2180.669448468" watchObservedRunningTime="2026-04-20 19:44:31.951584628 +0000 UTC m=+2180.670887401"
Apr 20 19:44:32.470699 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:32.470674 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-srqmp_808a40da-6675-4800-964d-852bb302978e/dns/0.log"
Apr 20 19:44:32.489912 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:32.489870 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-srqmp_808a40da-6675-4800-964d-852bb302978e/kube-rbac-proxy/0.log"
Apr 20 19:44:32.571352 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:32.571324 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lghm5_a8d74104-dc6c-4c07-a9eb-c5e76c1d3b08/dns-node-resolver/0.log"
Apr 20 19:44:33.078750 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:33.078716 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-665dcb5b4b-6xv5x_0c5ac44b-a86d-483e-a156-7693b47bc2db/registry/0.log"
Apr 20 19:44:33.101253 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:33.101224 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-dzjv7_89b86827-3229-4d28-8418-3ba07654afdd/node-ca/0.log"
Apr 20 19:44:34.034637 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:34.034607 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-8ptvr_7f2c3012-6e24-4bb5-b0e9-72fb8186d007/discovery/0.log"
Apr 20 19:44:34.049766 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:34.049726 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5489467c57-rkknm_8208c3ef-bdb3-4df4-81b8-808e9113792f/kube-auth-proxy/0.log"
Apr 20 19:44:34.603214 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:34.603188 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8c6k6_70ee668a-415c-4913-9312-a001e69b58d8/serve-healthcheck-canary/0.log"
Apr 20 19:44:35.174968 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:35.174943 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xgj9z_e9e9b8ab-e36a-433d-90a4-607be9937a16/kube-rbac-proxy/0.log"
Apr 20 19:44:35.191607 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:35.191583 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xgj9z_e9e9b8ab-e36a-433d-90a4-607be9937a16/exporter/0.log"
Apr 20 19:44:35.207763 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:35.207717 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xgj9z_e9e9b8ab-e36a-433d-90a4-607be9937a16/extractor/0.log"
Apr 20 19:44:37.033058 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:37.033004 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-njx8s_5e6fec0a-e407-4959-8eac-96add8aa5367/manager/0.log"
Apr 20 19:44:37.137626 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:37.137596 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-6448987b54-t6t94_fe74a19d-0983-487f-ad18-62767b812076/manager/0.log"
Apr 20 19:44:37.152089 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:37.152061 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-hkrrj_549ea97d-23c0-456d-b087-b04bb3694d05/manager/1.log"
Apr 20 19:44:37.174843 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:37.174806 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-hkrrj_549ea97d-23c0-456d-b087-b04bb3694d05/manager/2.log"
Apr 20 19:44:37.223857 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:37.223817 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c77764cd6-d25gj_26e1776a-b8a0-4fa8-aa99-2e7d50f0e3ba/manager/0.log"
Apr 20 19:44:37.276395 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:37.276367 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-kj5qj_b23241e5-4c3e-4302-9dd1-bf63e4497593/postgres/0.log"
Apr 20 19:44:37.948330 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:37.948306 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-lmbz5"
Apr 20 19:44:43.982070 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:43.982039 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tng2d_09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b/kube-multus-additional-cni-plugins/0.log"
Apr 20 19:44:43.997908 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:43.997883 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tng2d_09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b/egress-router-binary-copy/0.log"
Apr 20 19:44:44.014566 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:44.014544 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tng2d_09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b/cni-plugins/0.log"
Apr 20 19:44:44.056333 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:44.056305 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tng2d_09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b/bond-cni-plugin/0.log"
Apr 20 19:44:44.072335 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:44.072307 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tng2d_09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b/routeoverride-cni/0.log"
Apr 20 19:44:44.088516 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:44.088489 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tng2d_09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b/whereabouts-cni-bincopy/0.log"
Apr 20 19:44:44.103956 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:44.103940 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tng2d_09af58d0-bcc7-4f62-a6cc-c8e13d0edc7b/whereabouts-cni/0.log"
Apr 20 19:44:44.135077 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:44.135056 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hktrf_5cfa2a80-feb6-4f70-8ca4-75225aa4dae7/kube-multus/0.log"
Apr 20 19:44:44.264646 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:44.264574 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rwnnv_e9c238c6-ab0d-4140-b842-f59e7642479c/network-metrics-daemon/0.log"
Apr 20 19:44:44.278161 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:44.278129 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rwnnv_e9c238c6-ab0d-4140-b842-f59e7642479c/kube-rbac-proxy/0.log"
Apr 20 19:44:45.251258 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:45.251212 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hqs9s_73d65bde-5605-46fa-9c02-62fdc3f51501/ovn-controller/0.log"
Apr 20 19:44:45.283152 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:45.283117 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hqs9s_73d65bde-5605-46fa-9c02-62fdc3f51501/ovn-acl-logging/0.log"
Apr 20 19:44:45.300059 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:45.300036 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hqs9s_73d65bde-5605-46fa-9c02-62fdc3f51501/kube-rbac-proxy-node/0.log"
Apr 20 19:44:45.319158 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:45.319136 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hqs9s_73d65bde-5605-46fa-9c02-62fdc3f51501/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 19:44:45.330644 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:45.330606 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hqs9s_73d65bde-5605-46fa-9c02-62fdc3f51501/northd/0.log"
Apr 20 19:44:45.344994 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:45.344967 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hqs9s_73d65bde-5605-46fa-9c02-62fdc3f51501/nbdb/0.log"
Apr 20 19:44:45.360289 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:45.360270 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hqs9s_73d65bde-5605-46fa-9c02-62fdc3f51501/sbdb/0.log"
Apr 20 19:44:45.518071 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:45.517989 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hqs9s_73d65bde-5605-46fa-9c02-62fdc3f51501/ovnkube-controller/0.log"
Apr 20 19:44:46.831614 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:46.831588 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-nvhzk_d9f0d4ce-307e-4bf5-99dc-e95da51d3c9e/network-check-target-container/0.log"
Apr 20 19:44:47.760871 ip-10-0-136-5 kubenswrapper[2572]: I0420 19:44:47.760834 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-dz9mm_a96f7ddd-9780-4b41-abdb-dc8d64c0cb84/iptables-alerter/0.log"