Apr 16 19:30:16.956018 ip-10-0-133-198 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:30:17.391714 ip-10-0-133-198 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:30:17.391714 ip-10-0-133-198 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:30:17.391714 ip-10-0-133-198 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:30:17.391714 ip-10-0-133-198 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:30:17.391714 ip-10-0-133-198 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
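The deprecation warnings above all point at the kubelet config file mechanism. A hypothetical KubeletConfiguration fragment sketching how those flags could be moved into the file referenced by `--config` (field names are from the upstream `kubelet.config.k8s.io/v1beta1` API; the paths and reservation values below are illustrative examples, not values taken from this node):

```yaml
# Sketch of a kubelet config file (e.g. the file passed via --config).
# Field names come from the KubeletConfiguration v1beta1 API; values are examples.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock   # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:                                             # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:                                               # replaces --minimum-container-ttl-duration
  memory.available: 100Mi
```

On a managed OpenShift node this file is typically owned by the Machine Config Operator, so edits would normally go through a KubeletConfig custom resource rather than the file directly.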
Apr 16 19:30:17.392985 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.392535 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:30:17.397463 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397441 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:17.397463 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397459 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:17.397463 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397466 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397471 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397477 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397481 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397485 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397489 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397493 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397497 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397501 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397505 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397511 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397516 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397519 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397539 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397544 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397548 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397551 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397555 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397559 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397563 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:17.397759 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397567 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397570 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397575 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397579 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397583 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397587 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397591 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397595 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397600 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397605 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397610 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397614 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397618 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397623 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397627 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397632 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397637 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397643 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397647 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397651 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:17.398567 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397656 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397660 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397664 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397687 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397693 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397697 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397702 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397706 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397710 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397714 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397718 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397722 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397726 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397730 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397734 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397738 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397742 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397747 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397751 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397755 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:17.399445 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397759 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397770 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397774 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397779 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397783 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397788 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397792 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397796 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397800 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397804 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397809 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397813 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397817 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397821 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397825 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397829 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397833 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397837 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397841 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397847 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:17.400323 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397851 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397855 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397859 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.397863 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398458 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398466 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398471 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398475 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398479 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398483 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398488 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398492 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398498 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398502 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398506 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398510 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398515 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398520 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398524 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398528 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:17.400859 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398532 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398537 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398541 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398545 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398549 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398554 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398560 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398565 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398569 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398573 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398577 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398583 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398587 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398591 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398595 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398599 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398603 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398608 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398612 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:17.401409 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398616 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398621 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398625 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398629 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398634 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398638 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398642 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398646 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398651 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398655 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398659 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398664 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398692 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398697 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398702 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398723 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398727 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398732 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398736 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398740 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:17.402088 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398744 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398748 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398753 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398757 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398761 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398766 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398771 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398775 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398779 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398783 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398788 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398792 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398796 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398801 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398805 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398809 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398815 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398819 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398823 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398830 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:17.402647 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398836 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398841 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398846 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398851 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398855 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398859 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398863 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398868 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398872 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398876 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.398881 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.398982 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.398993 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399004 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399011 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399018 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399023 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399030 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399038 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399044 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399049 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:30:17.403218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399055 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399060 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399065 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399070 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399074 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399079 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399084 2568 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399089 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399093 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399099 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399104 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399119 2568 flags.go:64] FLAG: --config-dir=""
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399123 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399129 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399136 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399141 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399146 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399151 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399156 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399160 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399165 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399170 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399175 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399182 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399187 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399192 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399197 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399202 2568 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399206 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399214 2568 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399220 2568 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399225 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399230 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399235 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399241 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399245 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399250 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399255 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399259 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399264 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399269 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399274 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399278 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399283 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399288 2568 flags.go:64] FLAG: --feature-gates=""
Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399294 2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 19:30:17.404414 ip-10-0-133-198
kubenswrapper[2568]: I0416 19:30:17.399300 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399305 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399310 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399315 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:30:17.404414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399320 2568 flags.go:64] FLAG: --help="false" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399325 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399330 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399335 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399340 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399346 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399351 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399363 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399368 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399373 2568 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399378 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399383 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399388 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399393 2568 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399398 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399403 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399408 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399412 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399417 2568 flags.go:64] FLAG: --lock-file="" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399421 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399426 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399431 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399440 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:30:17.405033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399444 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:30:17.405595 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:30:17.399448 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399452 2568 flags.go:64] FLAG: --logging-format="text" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399457 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399462 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399466 2568 flags.go:64] FLAG: --manifest-url="" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399471 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399478 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399483 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399490 2568 flags.go:64] FLAG: --max-pods="110" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399495 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399500 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399505 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399510 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399514 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399519 2568 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399527 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399541 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399546 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399551 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399555 2568 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399560 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399569 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399575 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:30:17.405595 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399580 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399584 2568 flags.go:64] FLAG: --port="10250" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399589 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399594 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e39b0b258b04dd64" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399599 2568 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399604 
2568 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399608 2568 flags.go:64] FLAG: --register-node="true" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399613 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399618 2568 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399624 2568 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399629 2568 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399633 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399638 2568 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399643 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399648 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399654 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399659 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399663 2568 flags.go:64] FLAG: --runonce="false" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399668 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399693 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 
19:30:17.399698 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399702 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399707 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399712 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399718 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399723 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:30:17.406356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399729 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399734 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399739 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399743 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399748 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399753 2568 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399759 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399768 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399772 2568 flags.go:64] FLAG: 
--tls-cert-file="" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399777 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399784 2568 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399788 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399792 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399797 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399802 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399807 2568 flags.go:64] FLAG: --v="2" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399814 2568 flags.go:64] FLAG: --version="false" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399820 2568 flags.go:64] FLAG: --vmodule="" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399827 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.399832 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.399978 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.399985 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.399990 2568 feature_gate.go:328] unrecognized feature gate: 
AzureDedicatedHosts Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.399994 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:30:17.406996 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.399999 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400003 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400007 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400011 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400015 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400022 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
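The `flags.go:64] FLAG: --name="value"` entries above are the kubelet's startup dump of every resolved command-line flag. When auditing a node it can be convenient to pull these into a dictionary; below is a minimal sketch — the regex and the sample line are assumptions modeled on the format seen in this log, not a published kubelet API:

```python
import re

# Matches entries like:
#   ... flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
FLAG_RE = re.compile(r'FLAG: --([\w-]+)="(.*?)"')

def parse_flag_dump(log_text: str) -> dict[str, str]:
    """Collect flag-name -> value pairs from a kubelet journal dump."""
    return {name: value for name, value in FLAG_RE.findall(log_text)}

# Hypothetical single entry copied from the log format above.
sample = (
    'Apr 16 19:30:17.403798 ip-10-0-133-198 kubenswrapper[2568]: '
    'I0416 19:30:17.399104 2568 flags.go:64] '
    'FLAG: --config="/etc/kubernetes/kubelet.conf"'
)
print(parse_flag_dump(sample))  # {'config': '/etc/kubernetes/kubelet.conf'}
```

Fed the whole journal excerpt, this yields one entry per flag (later duplicates overwrite earlier ones, which matches how the kubelet logs each flag exactly once per start).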
Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400029 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400033 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400039 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400044 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400048 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400053 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400057 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400061 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400066 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400071 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400075 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400079 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400083 2568 feature_gate.go:328] 
unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400087 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:30:17.407716 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400092 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400096 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400100 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400104 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400108 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400112 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400116 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400120 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400124 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400129 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400133 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:30:17.408237 ip-10-0-133-198 
kubenswrapper[2568]: W0416 19:30:17.400137 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400141 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400145 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400150 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400153 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400157 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400162 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400168 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400173 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:30:17.408237 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400179 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400183 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400187 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400192 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 
19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400196 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400200 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400204 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400210 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400217 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400221 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400225 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400230 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400234 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400238 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400243 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400247 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:30:17.408754 ip-10-0-133-198 
kubenswrapper[2568]: W0416 19:30:17.400252 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400256 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400261 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:30:17.408754 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400265 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400270 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400274 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400278 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400282 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400286 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400290 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400295 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400299 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400303 2568 feature_gate.go:328] unrecognized feature 
gate: MachineAPIMigration
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400308 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400313 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400317 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400322 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400326 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400331 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400335 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400339 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400343 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400348 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:17.409223 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400352 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400356 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.400361 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.401055 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.409183 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.409199 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409246 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409251 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409255 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409259 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409262 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409265 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409268 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409270 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409273 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:17.409732 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409275 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409278 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409280 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409283 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409286 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409289 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409291 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409294 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409296 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409299 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409301 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409304 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409306 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409309 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409311 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409314 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409316 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409319 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409322 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409328 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:17.410103 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409331 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409333 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409336 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409339 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409342 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409344 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409347 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409349 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409352 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409354 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409357 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409359 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409362 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409364 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409367 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409369 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409372 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409374 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409378 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:17.410589 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409382 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409386 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409388 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409391 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409394 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409397 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409399 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409402 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409405 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409407 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409410 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409412 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409415 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409418 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409420 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409423 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409426 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409429 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409431 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409434 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:17.411120 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409436 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409439 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409442 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409444 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409447 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409449 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409452 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409454 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409463 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409467 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409471 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409473 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409476 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409479 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409482 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409484 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409487 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:17.411598 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409489 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.409495 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409590 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409595 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409598 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409601 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409604 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409606 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409609 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409611 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409614 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409617 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409621 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409625 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409628 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:17.412056 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409631 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409634 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409637 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409640 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409642 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409645 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409647 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409650 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409653 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409656 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409659 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409661 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409664 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409666 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409669 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409686 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409689 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409691 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409694 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:17.412427 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409696 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409699 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409702 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409704 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409707 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409709 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409711 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409714 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409717 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409721 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409723 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409726 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409728 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409731 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409733 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409736 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409739 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409741 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409744 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409747 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:17.412927 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409749 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409752 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409754 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409757 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409759 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409762 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409765 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409768 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409770 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409773 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409775 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409777 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409780 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409783 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409785 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409788 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409790 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409793 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409795 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409798 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409800 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:17.413415 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409803 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409806 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409809 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409813 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409816 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409818 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409821 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409823 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409826 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409828 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409830 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409833 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:17.409835 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.409840 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.410544 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:30:17.413939 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.412439 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:30:17.414304 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.413424 2568 server.go:1019] "Starting client certificate rotation"
Apr 16 19:30:17.414304 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.413526 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:30:17.414304 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.413556 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:30:17.436520 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.436493 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:30:17.439087 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.439061 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:30:17.454150 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.454128 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:30:17.460040 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.460026 2568 log.go:25] "Validated CRI v1 image API"
Apr 16 19:30:17.461350 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.461335 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:30:17.466328 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.466310 2568 fs.go:135] Filesystem UUIDs: map[09796d53-e7e1-4d64-9a51-b9bd610cf1db:/dev/nvme0n1p4 3603fc90-6d66-4ee7-b1b3-040f20310945:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 19:30:17.466384 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.466329 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:30:17.471172 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.471150 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:30:17.473708 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.473585 2568 manager.go:217] Machine: {Timestamp:2026-04-16 19:30:17.471828516 +0000 UTC m=+0.395525107 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098922 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec226a6fa013e6d98ade00447fe1f384 SystemUUID:ec226a6f-a013-e6d9-8ade-00447fe1f384 BootID:df464178-c3a0-4b7e-8db9-85e26e7e33ea Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:ab:c3:c4:dd:91 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:ab:c3:c4:dd:91 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e6:e1:86:77:4f:56 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:30:17.474138 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.474123 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:30:17.474286 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.474265 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:30:17.475446 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.475423 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:30:17.475607 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.475448 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-198.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 19:30:17.475707 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.475623 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 19:30:17.475707 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.475636 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 19:30:17.475707 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.475654 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:30:17.476620 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.476607 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:30:17.477495 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.477484 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:30:17.477627 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.477616 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:30:17.480218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.480207 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:30:17.480273 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.480224 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:30:17.480273 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.480242 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:30:17.480273 ip-10-0-133-198 kubenswrapper[2568]: I0416 
19:30:17.480257 2568 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:30:17.480273 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.480271 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 19:30:17.481355 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.481341 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:30:17.481416 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.481364 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:30:17.484736 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.484711 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:30:17.486002 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.485990 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:30:17.487813 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487800 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487817 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487824 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487833 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487842 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487848 2568 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/secret" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487854 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487859 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487865 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487872 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:30:17.487877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487881 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:30:17.488232 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.487891 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:30:17.488566 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.488548 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:30:17.488566 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.488559 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:30:17.492297 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.492164 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:30:17.492353 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.492320 2568 server.go:1295] "Started kubelet" Apr 16 19:30:17.492407 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.492227 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mp67t" Apr 16 19:30:17.492469 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.492439 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 
19:30:17.492523 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.492460 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:30:17.492552 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.492520 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:30:17.493003 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.492973 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:30:17.493003 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.492973 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:30:17.493186 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.493170 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-198.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:30:17.493230 ip-10-0-133-198 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 19:30:17.493807 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.493713 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:30:17.495010 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.494994 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:30:17.499556 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.499532 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:30:17.500261 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.500244 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:30:17.500884 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.500867 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:30:17.500884 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.500887 2568 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:30:17.501011 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.500906 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:30:17.501080 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.501065 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:30:17.501135 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.501087 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:30:17.501317 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.501214 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 16 19:30:17.501443 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.501427 2568 factory.go:55] Registering systemd factory Apr 16 19:30:17.501547 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.501533 2568 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:30:17.501761 ip-10-0-133-198 kubenswrapper[2568]: I0416 
19:30:17.501745 2568 factory.go:153] Registering CRI-O factory Apr 16 19:30:17.501761 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.501763 2568 factory.go:223] Registration of the crio container factory successfully Apr 16 19:30:17.501906 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.501811 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:30:17.501906 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.501842 2568 factory.go:103] Registering Raw factory Apr 16 19:30:17.501906 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.501857 2568 manager.go:1196] Started watching for new ooms in manager Apr 16 19:30:17.502658 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.502642 2568 manager.go:319] Starting recovery of all containers Apr 16 19:30:17.502866 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.502845 2568 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:30:17.502974 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.502946 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 19:30:17.503067 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.503008 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-198.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 19:30:17.507017 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.506994 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-mp67t" Apr 16 19:30:17.507219 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.503056 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-198.ec2.internal.18a6ed1e50108381 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-198.ec2.internal,UID:ip-10-0-133-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-198.ec2.internal,},FirstTimestamp:2026-04-16 19:30:17.492300673 +0000 UTC m=+0.415997264,LastTimestamp:2026-04-16 19:30:17.492300673 +0000 UTC m=+0.415997264,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-198.ec2.internal,}" Apr 16 19:30:17.514768 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.514730 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 19:30:17.517385 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.517370 2568 manager.go:324] Recovery completed Apr 16 19:30:17.521313 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.521300 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.524371 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.524357 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.524451 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.524384 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.524451 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.524394 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.524833 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.524820 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:30:17.524833 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.524831 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:30:17.524925 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.524846 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:30:17.526687 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.526618 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-133-198.ec2.internal.18a6ed1e51f9ded7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-198.ec2.internal,UID:ip-10-0-133-198.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-198.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-198.ec2.internal,},FirstTimestamp:2026-04-16 19:30:17.524371159 +0000 UTC m=+0.448067749,LastTimestamp:2026-04-16 19:30:17.524371159 +0000 UTC m=+0.448067749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-198.ec2.internal,}" Apr 16 19:30:17.526788 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.526776 2568 policy_none.go:49] "None policy: Start" Apr 16 19:30:17.526819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.526797 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:30:17.526819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.526808 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:30:17.563395 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.563328 2568 manager.go:341] "Starting Device Plugin manager" Apr 16 19:30:17.575564 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.563451 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:30:17.575564 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.563467 2568 server.go:85] "Starting device plugin registration server" Apr 16 19:30:17.575564 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.563708 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:30:17.575564 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.563721 2568 container_log_manager.go:189] "Initializing container 
log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:30:17.575564 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.563812 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:30:17.575564 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.563901 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:30:17.575564 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.563910 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:30:17.575564 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.564396 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:30:17.575564 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.564436 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-198.ec2.internal\" not found" Apr 16 19:30:17.647385 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.647322 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:30:17.647385 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.647356 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:30:17.647385 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.647375 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 19:30:17.647385 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.647384 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:30:17.647636 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.647415 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:30:17.651317 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.651294 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:17.664638 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.664618 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.665923 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.665905 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.666010 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.665939 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.666010 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.665955 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.666010 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.665983 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.674348 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.674328 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.674431 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.674349 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-198.ec2.internal\": node \"ip-10-0-133-198.ec2.internal\" not found" Apr 16 
19:30:17.689545 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.689519 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 16 19:30:17.747806 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.747787 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal"] Apr 16 19:30:17.747859 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.747846 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.748669 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.748652 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.748774 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.748697 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.748774 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.748713 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.749845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.749833 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.750005 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.749992 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.750057 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.750020 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.750578 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.750559 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.750578 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.750579 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.750733 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.750593 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.750908 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.750893 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.751000 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.750922 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.751000 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.750935 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.751909 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.751896 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.751977 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.751921 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.752511 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.752495 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.752592 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.752517 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.752592 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.752551 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.766182 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.766163 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-198.ec2.internal\" not found" node="ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.770274 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.770257 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-198.ec2.internal\" not found" node="ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.789580 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.789552 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 16 19:30:17.802007 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.801989 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.802061 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.802019 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.802105 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.802060 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4cf8424ac7d796a78f96e62791daed1d-config\") pod \"kube-apiserver-proxy-ip-10-0-133-198.ec2.internal\" (UID: \"4cf8424ac7d796a78f96e62791daed1d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.890385 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.890364 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found" Apr 16 19:30:17.902664 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.902622 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4cf8424ac7d796a78f96e62791daed1d-config\") pod \"kube-apiserver-proxy-ip-10-0-133-198.ec2.internal\" (UID: \"4cf8424ac7d796a78f96e62791daed1d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" Apr 16 19:30:17.902664 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.902646 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal"
Apr 16 19:30:17.902664 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.902663 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal"
Apr 16 19:30:17.902814 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.902706 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal"
Apr 16 19:30:17.902814 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.902718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4cf8424ac7d796a78f96e62791daed1d-config\") pod \"kube-apiserver-proxy-ip-10-0-133-198.ec2.internal\" (UID: \"4cf8424ac7d796a78f96e62791daed1d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal"
Apr 16 19:30:17.902814 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:17.902749 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9fc3cee085866514e6ec8498fa3dbfcb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal\" (UID: \"9fc3cee085866514e6ec8498fa3dbfcb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal"
Apr 16 19:30:17.991068 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:17.991030 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found"
Apr 16 19:30:18.068546 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.068524 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal"
Apr 16 19:30:18.073124 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.073105 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal"
Apr 16 19:30:18.091536 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:18.091516 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found"
Apr 16 19:30:18.192141 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:18.192116 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found"
Apr 16 19:30:18.292668 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:18.292623 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found"
Apr 16 19:30:18.369980 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.369951 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:30:18.393379 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:18.393352 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found"
Apr 16 19:30:18.413793 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.413775 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 19:30:18.413896 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.413881 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:30:18.413940 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.413921 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 19:30:18.493465 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:18.493433 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-198.ec2.internal\" not found"
Apr 16 19:30:18.499621 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.499605 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 19:30:18.509078 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.509041 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:25:17 +0000 UTC" deadline="2028-01-18 15:54:52.385474994 +0000 UTC"
Apr 16 19:30:18.509078 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.509074 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15404h24m33.876404071s"
Apr 16 19:30:18.514496 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.514473 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:30:18.535312 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:18.535286 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc3cee085866514e6ec8498fa3dbfcb.slice/crio-29c385e2e5a7803bd8b1cc5f4441ae6dd5a3f852aae058d8ba2a1bed72d24c89 WatchSource:0}: Error finding container 29c385e2e5a7803bd8b1cc5f4441ae6dd5a3f852aae058d8ba2a1bed72d24c89: Status 404 returned error can't find the container with id 29c385e2e5a7803bd8b1cc5f4441ae6dd5a3f852aae058d8ba2a1bed72d24c89
Apr 16 19:30:18.535564 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:18.535546 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf8424ac7d796a78f96e62791daed1d.slice/crio-497f2094b5dc540964cde58efd7f984ecfdb7786a5377d76c775eddab3aaa5f0 WatchSource:0}: Error finding container 497f2094b5dc540964cde58efd7f984ecfdb7786a5377d76c775eddab3aaa5f0: Status 404 returned error can't find the container with id 497f2094b5dc540964cde58efd7f984ecfdb7786a5377d76c775eddab3aaa5f0
Apr 16 19:30:18.537347 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.537327 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fln8r"
Apr 16 19:30:18.539501 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.539484 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:30:18.543628 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.543613 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fln8r"
Apr 16 19:30:18.580732 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.580713 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:30:18.601435 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.601412 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal"
Apr 16 19:30:18.614016 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.613992 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:30:18.615006 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.614991 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal"
Apr 16 19:30:18.625155 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.625133 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:30:18.649966 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.649915 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" event={"ID":"4cf8424ac7d796a78f96e62791daed1d","Type":"ContainerStarted","Data":"497f2094b5dc540964cde58efd7f984ecfdb7786a5377d76c775eddab3aaa5f0"}
Apr 16 19:30:18.650824 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.650804 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" event={"ID":"9fc3cee085866514e6ec8498fa3dbfcb","Type":"ContainerStarted","Data":"29c385e2e5a7803bd8b1cc5f4441ae6dd5a3f852aae058d8ba2a1bed72d24c89"}
Apr 16 19:30:18.956610 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:18.956578 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:30:19.250231 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.250190 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:30:19.481609 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.481572 2568 apiserver.go:52] "Watching apiserver"
Apr 16 19:30:19.488524 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.488500 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 19:30:19.489900 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.489873 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vsb4w","openshift-multus/multus-additional-cni-plugins-gt9gg","openshift-multus/network-metrics-daemon-p9mcp","openshift-network-diagnostics/network-check-target-h6v8t","openshift-ovn-kubernetes/ovnkube-node-vf8sc","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt","openshift-cluster-node-tuning-operator/tuned-2nws7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal","openshift-multus/multus-g2sd7","openshift-network-operator/iptables-alerter-spns4","kube-system/konnectivity-agent-dxnh8","kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal"]
Apr 16 19:30:19.491703 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.491665 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vsb4w"
Apr 16 19:30:19.492879 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.492857 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.494518 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.494481 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4kztg\""
Apr 16 19:30:19.494767 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.494743 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 19:30:19.494885 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.494745 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:19.494885 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.494790 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 19:30:19.494885 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.494820 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 19:30:19.494885 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.494835 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:19.495194 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:19.494879 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6"
Apr 16 19:30:19.495194 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:19.494899 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f"
Apr 16 19:30:19.495338 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.495237 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:30:19.495408 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.495345 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m7qsg\""
Apr 16 19:30:19.495510 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.495485 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:30:19.495640 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.495564 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:30:19.495640 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.495615 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:30:19.495752 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.495698 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:30:19.496064 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.496046 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.497326 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.497304 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.498184 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.498161 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 19:30:19.498459 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.498442 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dkf26\""
Apr 16 19:30:19.498610 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.498585 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 19:30:19.499074 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.499023 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 19:30:19.499321 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.499305 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 19:30:19.499528 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.499514 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 19:30:19.501819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.500138 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 19:30:19.501819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.500382 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 19:30:19.501819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.500447 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 19:30:19.501819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.500487 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-67p2p\""
Apr 16 19:30:19.501819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.500944 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 19:30:19.502548 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.502524 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.504461 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.504436 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.504592 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.504573 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 19:30:19.504886 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.504865 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:30:19.505003 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.504977 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2zhxj\""
Apr 16 19:30:19.505557 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.505539 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-spns4"
Apr 16 19:30:19.506712 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.506689 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dxnh8"
Apr 16 19:30:19.506829 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.506767 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:30:19.506893 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.506832 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fqv6z\""
Apr 16 19:30:19.507757 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.507738 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 19:30:19.508655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.508625 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:30:19.508760 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.508711 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 19:30:19.508871 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.508853 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 19:30:19.509130 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.509112 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8zsmq\""
Apr 16 19:30:19.509205 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.509171 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mpp7w\""
Apr 16 19:30:19.509205 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.509179 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 19:30:19.511523 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-var-lib-openvswitch\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.511596 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511538 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-etc-openvswitch\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.511596 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511565 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-log-socket\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.511596 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511585 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-device-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.511732 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511600 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-run-netns\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.511732 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511615 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-run-ovn-kubernetes\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.511732 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511648 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vkkv\" (UniqueName: \"kubernetes.io/projected/a149ee97-cada-4c41-b88a-0351739b3d48-kube-api-access-6vkkv\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.511732 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-sys\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.511923 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511759 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:19.511923 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a149ee97-cada-4c41-b88a-0351739b3d48-ovnkube-config\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.511923 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511825 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a149ee97-cada-4c41-b88a-0351739b3d48-env-overrides\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.511923 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511849 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvwjb\" (UniqueName: \"kubernetes.io/projected/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-kube-api-access-qvwjb\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.511923 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511877 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-kubernetes\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.511923 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511900 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-kubelet\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.512167 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511924 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-slash\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.512167 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.511963 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-run-systemd\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.512167 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512011 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.512167 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512041 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a149ee97-cada-4c41-b88a-0351739b3d48-ovn-node-metrics-cert\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.512167 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512088 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-socket-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.512167 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.512167 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512140 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-registration-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512174 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-sys-fs\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512205 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n46p9\" (UniqueName: \"kubernetes.io/projected/930ab398-8c7e-4088-bae6-3bde68db4b00-kube-api-access-n46p9\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-sysctl-d\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512253 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-systemd\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512286 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-host\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512311 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/901bc30b-5940-410f-8379-b703113afa1a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ppg2\" (UniqueName: \"kubernetes.io/projected/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-kube-api-access-4ppg2\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512373 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55b65\" (UniqueName: \"kubernetes.io/projected/ab25b071-b28e-4ed4-8595-4ea92620f2bd-kube-api-access-55b65\") pod \"node-ca-vsb4w\" (UID: \"ab25b071-b28e-4ed4-8595-4ea92620f2bd\") " pod="openshift-image-registry/node-ca-vsb4w"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-run-openvswitch\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512445 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-modprobe-d\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.512493 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/901bc30b-5940-410f-8379-b703113afa1a-cni-binary-copy\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512514 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-systemd-units\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-etc-selinux\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-var-lib-kubelet\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512607 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-cni-bin\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512631 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-run\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512653 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-sysconfig\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512696 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-system-cni-dir\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512719 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/901bc30b-5940-410f-8379-b703113afa1a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512745 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-os-release\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512767 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-tuned\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512792 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-tmp\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512860 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-node-log\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16
19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512878 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a149ee97-cada-4c41-b88a-0351739b3d48-ovnkube-script-lib\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512895 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab25b071-b28e-4ed4-8595-4ea92620f2bd-host\") pod \"node-ca-vsb4w\" (UID: \"ab25b071-b28e-4ed4-8595-4ea92620f2bd\") " pod="openshift-image-registry/node-ca-vsb4w" Apr 16 19:30:19.512983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512912 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab25b071-b28e-4ed4-8595-4ea92620f2bd-serviceca\") pod \"node-ca-vsb4w\" (UID: \"ab25b071-b28e-4ed4-8595-4ea92620f2bd\") " pod="openshift-image-registry/node-ca-vsb4w" Apr 16 19:30:19.513432 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512949 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-run-ovn\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.513432 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.512985 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-cni-netd\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.513432 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.513010 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-lib-modules\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.513432 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.513031 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-sysctl-conf\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.513432 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.513054 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-cnibin\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg" Apr 16 19:30:19.513432 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.513079 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvbcr\" (UniqueName: \"kubernetes.io/projected/901bc30b-5940-410f-8379-b703113afa1a-kube-api-access-nvbcr\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg" Apr 16 19:30:19.544883 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.544856 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" 
expiration="2028-04-15 19:25:18 +0000 UTC" deadline="2027-12-24 11:54:38.957400176 +0000 UTC" Apr 16 19:30:19.544883 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.544881 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14800h24m19.412521645s" Apr 16 19:30:19.601731 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.601700 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:30:19.613556 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab25b071-b28e-4ed4-8595-4ea92620f2bd-serviceca\") pod \"node-ca-vsb4w\" (UID: \"ab25b071-b28e-4ed4-8595-4ea92620f2bd\") " pod="openshift-image-registry/node-ca-vsb4w" Apr 16 19:30:19.613556 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613561 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-run-ovn\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.613556 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613577 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-cni-netd\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613597 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-lib-modules\") pod \"tuned-2nws7\" (UID: 
\"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-run-k8s-cni-cncf-io\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613647 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-sysctl-conf\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613656 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-run-ovn\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613691 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-cnibin\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvbcr\" (UniqueName: 
\"kubernetes.io/projected/901bc30b-5940-410f-8379-b703113afa1a-kube-api-access-nvbcr\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613744 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-var-lib-openvswitch\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613769 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-etc-openvswitch\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613794 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-log-socket\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-device-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613841 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-run-netns\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.613845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613852 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-lib-modules\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613744 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-cni-netd\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613865 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-run-netns\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613907 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-run-netns\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613951 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-run-ovn-kubernetes\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613992 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-etc-openvswitch\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614016 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ab25b071-b28e-4ed4-8595-4ea92620f2bd-serviceca\") pod \"node-ca-vsb4w\" (UID: \"ab25b071-b28e-4ed4-8595-4ea92620f2bd\") " pod="openshift-image-registry/node-ca-vsb4w" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614026 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-sysctl-conf\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614030 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-log-socket\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-cnibin\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614080 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-device-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.613907 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-run-ovn-kubernetes\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614110 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-var-lib-openvswitch\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vkkv\" (UniqueName: \"kubernetes.io/projected/a149ee97-cada-4c41-b88a-0351739b3d48-kube-api-access-6vkkv\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614199 
2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-sys\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-var-lib-kubelet\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614252 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:19.614361 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614274 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a149ee97-cada-4c41-b88a-0351739b3d48-ovnkube-config\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a149ee97-cada-4c41-b88a-0351739b3d48-env-overrides\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.615165 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:30:19.614304 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-sys\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614325 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvwjb\" (UniqueName: \"kubernetes.io/projected/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-kube-api-access-qvwjb\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-system-cni-dir\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614383 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aff5494c-5205-4f24-a716-e1d25bc64f7c-cni-binary-copy\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-kubernetes\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.615165 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:30:19.614464 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-kubelet\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614489 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-slash\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614515 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-run-systemd\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614542 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614572 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a149ee97-cada-4c41-b88a-0351739b3d48-ovn-node-metrics-cert\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614599 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-socket-dir-parent\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614621 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-hostroot\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614649 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-socket-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614696 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxz6j\" (UniqueName: \"kubernetes.io/projected/aff5494c-5205-4f24-a716-e1d25bc64f7c-kube-api-access-mxz6j\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg" Apr 16 19:30:19.615165 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614752 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-registration-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-sys-fs\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614810 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n46p9\" (UniqueName: \"kubernetes.io/projected/930ab398-8c7e-4088-bae6-3bde68db4b00-kube-api-access-n46p9\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614818 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a149ee97-cada-4c41-b88a-0351739b3d48-env-overrides\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-sysctl-d\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614860 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e224c06c-213a-487e-844c-5d72da62ac07-konnectivity-ca\") pod \"konnectivity-agent-dxnh8\" (UID: \"e224c06c-213a-487e-844c-5d72da62ac07\") " pod="kube-system/konnectivity-agent-dxnh8"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614887 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d2974a6f-4954-4cca-a5c8-ba2980890b8d-iptables-alerter-script\") pod \"iptables-alerter-spns4\" (UID: \"d2974a6f-4954-4cca-a5c8-ba2980890b8d\") " pod="openshift-network-operator/iptables-alerter-spns4"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614912 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-var-lib-cni-multus\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614940 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-systemd\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614947 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-kubelet\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614964 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-host\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614990 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-slash\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615054 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615115 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-socket-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615158 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-run-systemd\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614913 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-kubernetes\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615231 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-systemd\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.615849 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615255 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a149ee97-cada-4c41-b88a-0351739b3d48-ovnkube-config\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.614991 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-etc-kubernetes\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615274 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-host\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615318 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-sys-fs\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615346 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/901bc30b-5940-410f-8379-b703113afa1a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615363 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615371 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ppg2\" (UniqueName: \"kubernetes.io/projected/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-kube-api-access-4ppg2\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615396 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55b65\" (UniqueName: \"kubernetes.io/projected/ab25b071-b28e-4ed4-8595-4ea92620f2bd-kube-api-access-55b65\") pod \"node-ca-vsb4w\" (UID: \"ab25b071-b28e-4ed4-8595-4ea92620f2bd\") " pod="openshift-image-registry/node-ca-vsb4w"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615382 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-run-openvswitch\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615447 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615501 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-sysctl-d\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615519 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-modprobe-d\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615577 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615587 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-registration-dir\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615607 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-conf-dir\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/901bc30b-5940-410f-8379-b703113afa1a-cni-binary-copy\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.616402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-systemd-units\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615698 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-etc-selinux\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615722 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-var-lib-kubelet\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615735 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-modprobe-d\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615745 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-cni-bin\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615767 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-run\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615796 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv5jc\" (UniqueName: \"kubernetes.io/projected/d2974a6f-4954-4cca-a5c8-ba2980890b8d-kube-api-access-nv5jc\") pod \"iptables-alerter-spns4\" (UID: \"d2974a6f-4954-4cca-a5c8-ba2980890b8d\") " pod="openshift-network-operator/iptables-alerter-spns4"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-sysconfig\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615846 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-system-cni-dir\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615876 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/901bc30b-5940-410f-8379-b703113afa1a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615902 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-run-multus-certs\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615929 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-os-release\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615957 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-run-openvswitch\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615965 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-tuned\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.615995 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-tmp\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616022 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e224c06c-213a-487e-844c-5d72da62ac07-agent-certs\") pod \"konnectivity-agent-dxnh8\" (UID: \"e224c06c-213a-487e-844c-5d72da62ac07\") " pod="kube-system/konnectivity-agent-dxnh8"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616043 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/901bc30b-5940-410f-8379-b703113afa1a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.617365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616051 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-cni-dir\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616075 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-cnibin\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616099 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-os-release\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616174 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-node-log\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616200 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a149ee97-cada-4c41-b88a-0351739b3d48-ovnkube-script-lib\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2974a6f-4954-4cca-a5c8-ba2980890b8d-host-slash\") pod \"iptables-alerter-spns4\" (UID: \"d2974a6f-4954-4cca-a5c8-ba2980890b8d\") " pod="openshift-network-operator/iptables-alerter-spns4"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616251 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-var-lib-cni-bin\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616271 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab25b071-b28e-4ed4-8595-4ea92620f2bd-host\") pod \"node-ca-vsb4w\" (UID: \"ab25b071-b28e-4ed4-8595-4ea92620f2bd\") " pod="openshift-image-registry/node-ca-vsb4w"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616287 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-daemon-config\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616367 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/901bc30b-5940-410f-8379-b703113afa1a-cni-binary-copy\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616432 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-systemd-units\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616469 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/930ab398-8c7e-4088-bae6-3bde68db4b00-etc-selinux\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616500 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-var-lib-kubelet\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616524 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-host-cni-bin\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:19.616595 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:19.616700 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs podName:1a9f09ff-a8fe-41b7-b833-8c5091a88fb6 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:20.116656719 +0000 UTC m=+3.040353296 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs") pod "network-metrics-daemon-p9mcp" (UID: "1a9f09ff-a8fe-41b7-b833-8c5091a88fb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:19.618219 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616765 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-system-cni-dir\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.616878 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-run\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.617000 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-sysconfig\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.617008 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ab25b071-b28e-4ed4-8595-4ea92620f2bd-host\") pod \"node-ca-vsb4w\" (UID: \"ab25b071-b28e-4ed4-8595-4ea92620f2bd\") " pod="openshift-image-registry/node-ca-vsb4w"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.617014 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a149ee97-cada-4c41-b88a-0351739b3d48-node-log\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.617113 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/901bc30b-5940-410f-8379-b703113afa1a-os-release\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.617452 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a149ee97-cada-4c41-b88a-0351739b3d48-ovnkube-script-lib\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.617914 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/901bc30b-5940-410f-8379-b703113afa1a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.618613 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-tmp\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.618708 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-etc-tuned\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.618835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.618796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a149ee97-cada-4c41-b88a-0351739b3d48-ovn-node-metrics-cert\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.619870 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:19.619807 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:19.619870 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:19.619832 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:19.619870 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:19.619845 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f58z2 for pod openshift-network-diagnostics/network-check-target-h6v8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:19.620078 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:19.619895 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2 podName:9406cd72-aaad-46f9-8e4d-59aa641ee42f nodeName:}" failed. No retries permitted until 2026-04-16 19:30:20.119877646 +0000 UTC m=+3.043574224 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-f58z2" (UniqueName: "kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2") pod "network-check-target-h6v8t" (UID: "9406cd72-aaad-46f9-8e4d-59aa641ee42f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:19.622551 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.622522 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvbcr\" (UniqueName: \"kubernetes.io/projected/901bc30b-5940-410f-8379-b703113afa1a-kube-api-access-nvbcr\") pod \"multus-additional-cni-plugins-gt9gg\" (UID: \"901bc30b-5940-410f-8379-b703113afa1a\") " pod="openshift-multus/multus-additional-cni-plugins-gt9gg"
Apr 16 19:30:19.623013 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.622967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vkkv\" (UniqueName: \"kubernetes.io/projected/a149ee97-cada-4c41-b88a-0351739b3d48-kube-api-access-6vkkv\") pod \"ovnkube-node-vf8sc\" (UID: \"a149ee97-cada-4c41-b88a-0351739b3d48\") " pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:30:19.623251 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.623230 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvwjb\" (UniqueName: \"kubernetes.io/projected/6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2-kube-api-access-qvwjb\") pod \"tuned-2nws7\" (UID: \"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2\") " pod="openshift-cluster-node-tuning-operator/tuned-2nws7"
Apr 16 19:30:19.623688 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.623650 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n46p9\" (UniqueName: \"kubernetes.io/projected/930ab398-8c7e-4088-bae6-3bde68db4b00-kube-api-access-n46p9\") pod \"aws-ebs-csi-driver-node-2klnt\" (UID: \"930ab398-8c7e-4088-bae6-3bde68db4b00\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt"
Apr 16 19:30:19.623786 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.623767 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55b65\" (UniqueName: \"kubernetes.io/projected/ab25b071-b28e-4ed4-8595-4ea92620f2bd-kube-api-access-55b65\") pod \"node-ca-vsb4w\" (UID: \"ab25b071-b28e-4ed4-8595-4ea92620f2bd\") " pod="openshift-image-registry/node-ca-vsb4w"
Apr 16 19:30:19.624772 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.624756 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ppg2\" (UniqueName: \"kubernetes.io/projected/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-kube-api-access-4ppg2\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:19.717483 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d2974a6f-4954-4cca-a5c8-ba2980890b8d-iptables-alerter-script\") pod \"iptables-alerter-spns4\" (UID: \"d2974a6f-4954-4cca-a5c8-ba2980890b8d\") " pod="openshift-network-operator/iptables-alerter-spns4"
Apr 16 19:30:19.717646 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-var-lib-cni-multus\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717646 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-etc-kubernetes\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717646 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717581 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-conf-dir\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717646 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717607 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nv5jc\" (UniqueName: \"kubernetes.io/projected/d2974a6f-4954-4cca-a5c8-ba2980890b8d-kube-api-access-nv5jc\") pod \"iptables-alerter-spns4\" (UID: \"d2974a6f-4954-4cca-a5c8-ba2980890b8d\") " pod="openshift-network-operator/iptables-alerter-spns4"
Apr 16 19:30:19.717646 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717630 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-run-multus-certs\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717646 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717632 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-var-lib-cni-multus\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717656 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e224c06c-213a-487e-844c-5d72da62ac07-agent-certs\") pod \"konnectivity-agent-dxnh8\" (UID: \"e224c06c-213a-487e-844c-5d72da62ac07\") " pod="kube-system/konnectivity-agent-dxnh8"
Apr 16 19:30:19.717928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717689 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-etc-kubernetes\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717702 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-conf-dir\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717715 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-run-multus-certs\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717698 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-cni-dir\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717759 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-cni-dir\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-cnibin\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717884 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-cnibin\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.717928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717890 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-os-release\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717933 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2974a6f-4954-4cca-a5c8-ba2980890b8d-host-slash\") pod \"iptables-alerter-spns4\" (UID: \"d2974a6f-4954-4cca-a5c8-ba2980890b8d\") " pod="openshift-network-operator/iptables-alerter-spns4"
Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717940 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-os-release\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7"
Apr 16 19:30:19.718227
ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717958 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-var-lib-cni-bin\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2974a6f-4954-4cca-a5c8-ba2980890b8d-host-slash\") pod \"iptables-alerter-spns4\" (UID: \"d2974a6f-4954-4cca-a5c8-ba2980890b8d\") " pod="openshift-network-operator/iptables-alerter-spns4" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.717982 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-daemon-config\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718003 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-var-lib-cni-bin\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718008 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-run-k8s-cni-cncf-io\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:30:19.718029 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-run-k8s-cni-cncf-io\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718035 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-run-netns\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718062 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-var-lib-kubelet\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718096 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-system-cni-dir\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718097 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-run-netns\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718117 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aff5494c-5205-4f24-a716-e1d25bc64f7c-cni-binary-copy\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718133 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-host-var-lib-kubelet\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718147 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-socket-dir-parent\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718167 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-hostroot\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718179 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-system-cni-dir\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718194 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mxz6j\" (UniqueName: \"kubernetes.io/projected/aff5494c-5205-4f24-a716-e1d25bc64f7c-kube-api-access-mxz6j\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718846 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718241 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-socket-dir-parent\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718846 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718261 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e224c06c-213a-487e-844c-5d72da62ac07-konnectivity-ca\") pod \"konnectivity-agent-dxnh8\" (UID: \"e224c06c-213a-487e-844c-5d72da62ac07\") " pod="kube-system/konnectivity-agent-dxnh8" Apr 16 19:30:19.718846 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718282 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aff5494c-5205-4f24-a716-e1d25bc64f7c-hostroot\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718846 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aff5494c-5205-4f24-a716-e1d25bc64f7c-multus-daemon-config\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.718846 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718561 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d2974a6f-4954-4cca-a5c8-ba2980890b8d-iptables-alerter-script\") pod \"iptables-alerter-spns4\" (UID: \"d2974a6f-4954-4cca-a5c8-ba2980890b8d\") " pod="openshift-network-operator/iptables-alerter-spns4" Apr 16 19:30:19.718846 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718791 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aff5494c-5205-4f24-a716-e1d25bc64f7c-cni-binary-copy\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.719094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.718862 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e224c06c-213a-487e-844c-5d72da62ac07-konnectivity-ca\") pod \"konnectivity-agent-dxnh8\" (UID: \"e224c06c-213a-487e-844c-5d72da62ac07\") " pod="kube-system/konnectivity-agent-dxnh8" Apr 16 19:30:19.720132 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.720113 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e224c06c-213a-487e-844c-5d72da62ac07-agent-certs\") pod \"konnectivity-agent-dxnh8\" (UID: \"e224c06c-213a-487e-844c-5d72da62ac07\") " pod="kube-system/konnectivity-agent-dxnh8" Apr 16 19:30:19.725825 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.725806 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxz6j\" (UniqueName: \"kubernetes.io/projected/aff5494c-5205-4f24-a716-e1d25bc64f7c-kube-api-access-mxz6j\") pod \"multus-g2sd7\" (UID: \"aff5494c-5205-4f24-a716-e1d25bc64f7c\") " pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.725927 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.725892 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv5jc\" (UniqueName: 
\"kubernetes.io/projected/d2974a6f-4954-4cca-a5c8-ba2980890b8d-kube-api-access-nv5jc\") pod \"iptables-alerter-spns4\" (UID: \"d2974a6f-4954-4cca-a5c8-ba2980890b8d\") " pod="openshift-network-operator/iptables-alerter-spns4" Apr 16 19:30:19.805033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.804947 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vsb4w" Apr 16 19:30:19.811776 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.811744 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" Apr 16 19:30:19.819374 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.819339 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:19.823123 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.823095 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" Apr 16 19:30:19.828711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.828668 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2nws7" Apr 16 19:30:19.835367 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.835344 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g2sd7" Apr 16 19:30:19.843019 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.842996 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-spns4" Apr 16 19:30:19.848727 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:19.848706 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-dxnh8" Apr 16 19:30:20.057539 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:20.057505 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f2c8aa4_8f62_4688_865f_bcefc4f5fbc2.slice/crio-aa442728e652a181c3f8b8db91ec70f439eb2480bd57c4df888e03101f7ca0ad WatchSource:0}: Error finding container aa442728e652a181c3f8b8db91ec70f439eb2480bd57c4df888e03101f7ca0ad: Status 404 returned error can't find the container with id aa442728e652a181c3f8b8db91ec70f439eb2480bd57c4df888e03101f7ca0ad Apr 16 19:30:20.058464 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:20.058428 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaff5494c_5205_4f24_a716_e1d25bc64f7c.slice/crio-09cf3abb274fd64dea9b0578c5847b6a0332dce68a0b08eb6dd4556372f26d72 WatchSource:0}: Error finding container 09cf3abb274fd64dea9b0578c5847b6a0332dce68a0b08eb6dd4556372f26d72: Status 404 returned error can't find the container with id 09cf3abb274fd64dea9b0578c5847b6a0332dce68a0b08eb6dd4556372f26d72 Apr 16 19:30:20.059460 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:20.059380 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda149ee97_cada_4c41_b88a_0351739b3d48.slice/crio-8db83c21991e8b662c9263553f3b524016c800f366ee7f7b5887694c004edbde WatchSource:0}: Error finding container 8db83c21991e8b662c9263553f3b524016c800f366ee7f7b5887694c004edbde: Status 404 returned error can't find the container with id 8db83c21991e8b662c9263553f3b524016c800f366ee7f7b5887694c004edbde Apr 16 19:30:20.063878 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:20.063755 2568 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode224c06c_213a_487e_844c_5d72da62ac07.slice/crio-349fe10642d9b02e1ddcbc0ea6f2c540e657ec3460cbbacc6fb371c45af4a48e WatchSource:0}: Error finding container 349fe10642d9b02e1ddcbc0ea6f2c540e657ec3460cbbacc6fb371c45af4a48e: Status 404 returned error can't find the container with id 349fe10642d9b02e1ddcbc0ea6f2c540e657ec3460cbbacc6fb371c45af4a48e Apr 16 19:30:20.068094 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:20.067397 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab25b071_b28e_4ed4_8595_4ea92620f2bd.slice/crio-6da65fa8f9204dc77376ffc0743fc4d52b542f1a416b02a8c13fad8eb627145a WatchSource:0}: Error finding container 6da65fa8f9204dc77376ffc0743fc4d52b542f1a416b02a8c13fad8eb627145a: Status 404 returned error can't find the container with id 6da65fa8f9204dc77376ffc0743fc4d52b542f1a416b02a8c13fad8eb627145a Apr 16 19:30:20.070079 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:20.070057 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod930ab398_8c7e_4088_bae6_3bde68db4b00.slice/crio-e95f422f822dbeef38eac386d529200fff3fec819f6b261178106d0465eae893 WatchSource:0}: Error finding container e95f422f822dbeef38eac386d529200fff3fec819f6b261178106d0465eae893: Status 404 returned error can't find the container with id e95f422f822dbeef38eac386d529200fff3fec819f6b261178106d0465eae893 Apr 16 19:30:20.116961 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.116924 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:20.120625 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.120601 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod 
\"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:20.120748 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.120637 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:20.120748 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:20.120746 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:30:20.120848 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:20.120759 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:30:20.120848 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:20.120768 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f58z2 for pod openshift-network-diagnostics/network-check-target-h6v8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:20.120848 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:20.120771 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:20.120848 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:20.120816 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2 podName:9406cd72-aaad-46f9-8e4d-59aa641ee42f nodeName:}" failed. 
No retries permitted until 2026-04-16 19:30:21.120800556 +0000 UTC m=+4.044497151 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-f58z2" (UniqueName: "kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2") pod "network-check-target-h6v8t" (UID: "9406cd72-aaad-46f9-8e4d-59aa641ee42f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:20.120848 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:20.120834 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs podName:1a9f09ff-a8fe-41b7-b833-8c5091a88fb6 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:21.120825361 +0000 UTC m=+4.044521939 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs") pod "network-metrics-daemon-p9mcp" (UID: "1a9f09ff-a8fe-41b7-b833-8c5091a88fb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:20.546113 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.546063 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:25:18 +0000 UTC" deadline="2027-10-31 15:00:42.219946801 +0000 UTC" Apr 16 19:30:20.546113 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.546107 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13507h30m21.673843616s" Apr 16 19:30:20.647903 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.647865 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:20.648085 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:20.647998 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f" Apr 16 19:30:20.648446 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.648425 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:20.648559 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:20.648530 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6" Apr 16 19:30:20.664925 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.664886 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" event={"ID":"4cf8424ac7d796a78f96e62791daed1d","Type":"ContainerStarted","Data":"0b38fc0db3e31e23fcfe6bdf2031265c7f7781ec6f7b292720d0c8045c308462"} Apr 16 19:30:20.669454 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.669422 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dxnh8" event={"ID":"e224c06c-213a-487e-844c-5d72da62ac07","Type":"ContainerStarted","Data":"349fe10642d9b02e1ddcbc0ea6f2c540e657ec3460cbbacc6fb371c45af4a48e"} Apr 16 19:30:20.672462 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.672427 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" event={"ID":"a149ee97-cada-4c41-b88a-0351739b3d48","Type":"ContainerStarted","Data":"8db83c21991e8b662c9263553f3b524016c800f366ee7f7b5887694c004edbde"} Apr 16 19:30:20.673785 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.673761 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g2sd7" event={"ID":"aff5494c-5205-4f24-a716-e1d25bc64f7c","Type":"ContainerStarted","Data":"09cf3abb274fd64dea9b0578c5847b6a0332dce68a0b08eb6dd4556372f26d72"} Apr 16 19:30:20.683980 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.683948 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vsb4w" event={"ID":"ab25b071-b28e-4ed4-8595-4ea92620f2bd","Type":"ContainerStarted","Data":"6da65fa8f9204dc77376ffc0743fc4d52b542f1a416b02a8c13fad8eb627145a"} Apr 16 19:30:20.685583 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.685554 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" 
event={"ID":"930ab398-8c7e-4088-bae6-3bde68db4b00","Type":"ContainerStarted","Data":"e95f422f822dbeef38eac386d529200fff3fec819f6b261178106d0465eae893"} Apr 16 19:30:20.690507 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.690483 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" event={"ID":"901bc30b-5940-410f-8379-b703113afa1a","Type":"ContainerStarted","Data":"9f190944d0eb60988c33c8bc9f9230b2e00879338d642433c5fbf222ee0e4423"} Apr 16 19:30:20.695034 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.694987 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-spns4" event={"ID":"d2974a6f-4954-4cca-a5c8-ba2980890b8d","Type":"ContainerStarted","Data":"4df59b70325eb46873d12f3772ffab79a17252a29d893729ec2a98fd38afb1f9"} Apr 16 19:30:20.696388 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:20.696361 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2nws7" event={"ID":"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2","Type":"ContainerStarted","Data":"aa442728e652a181c3f8b8db91ec70f439eb2480bd57c4df888e03101f7ca0ad"} Apr 16 19:30:21.131254 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:21.131163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:21.131254 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:21.131210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " 
pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:21.131469 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:21.131319 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:30:21.131469 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:21.131332 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:30:21.131469 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:21.131341 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f58z2 for pod openshift-network-diagnostics/network-check-target-h6v8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:21.131469 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:21.131383 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2 podName:9406cd72-aaad-46f9-8e4d-59aa641ee42f nodeName:}" failed. No retries permitted until 2026-04-16 19:30:23.131371469 +0000 UTC m=+6.055068060 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f58z2" (UniqueName: "kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2") pod "network-check-target-h6v8t" (UID: "9406cd72-aaad-46f9-8e4d-59aa641ee42f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:21.131886 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:21.131782 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:21.131886 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:21.131837 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs podName:1a9f09ff-a8fe-41b7-b833-8c5091a88fb6 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:23.131820439 +0000 UTC m=+6.055517021 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs") pod "network-metrics-daemon-p9mcp" (UID: "1a9f09ff-a8fe-41b7-b833-8c5091a88fb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:21.715518 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:21.715459 2568 generic.go:358] "Generic (PLEG): container finished" podID="9fc3cee085866514e6ec8498fa3dbfcb" containerID="707e87618de6e305725743d1a864687d0e3c2b52a407ca900ac659a3221f4f0c" exitCode=0
Apr 16 19:30:21.716380 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:21.716354 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" event={"ID":"9fc3cee085866514e6ec8498fa3dbfcb","Type":"ContainerDied","Data":"707e87618de6e305725743d1a864687d0e3c2b52a407ca900ac659a3221f4f0c"}
Apr 16 19:30:21.730148 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:21.729628 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-198.ec2.internal" podStartSLOduration=3.729607061 podStartE2EDuration="3.729607061s" podCreationTimestamp="2026-04-16 19:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:20.679725562 +0000 UTC m=+3.603422162" watchObservedRunningTime="2026-04-16 19:30:21.729607061 +0000 UTC m=+4.653303667"
Apr 16 19:30:22.207539 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.207505 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-krqn6"]
Apr 16 19:30:22.209467 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.209443 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.209605 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:22.209527 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea"
Apr 16 19:30:22.240830 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.240739 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.240830 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.240821 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/252a776b-215a-41af-9c37-185dedf959ea-dbus\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.241061 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.240849 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/252a776b-215a-41af-9c37-185dedf959ea-kubelet-config\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.341894 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.341849 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.342113 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.341933 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/252a776b-215a-41af-9c37-185dedf959ea-dbus\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.342113 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.341961 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/252a776b-215a-41af-9c37-185dedf959ea-kubelet-config\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.342113 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.342076 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/252a776b-215a-41af-9c37-185dedf959ea-kubelet-config\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.342334 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:22.342182 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:22.342334 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:22.342241 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret podName:252a776b-215a-41af-9c37-185dedf959ea nodeName:}" failed. No retries permitted until 2026-04-16 19:30:22.842222483 +0000 UTC m=+5.765919062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret") pod "global-pull-secret-syncer-krqn6" (UID: "252a776b-215a-41af-9c37-185dedf959ea") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:22.342638 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.342603 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/252a776b-215a-41af-9c37-185dedf959ea-dbus\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.648909 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.648214 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:22.648909 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:22.648341 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f"
Apr 16 19:30:22.648909 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.648769 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:22.648909 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:22.648870 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6"
Apr 16 19:30:22.721484 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.721448 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" event={"ID":"9fc3cee085866514e6ec8498fa3dbfcb","Type":"ContainerStarted","Data":"a084e8368a950101d359fc7d76c4a83d83c395228a634327fb2eded71e0a193a"}
Apr 16 19:30:22.846897 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:22.846864 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:22.860103 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:22.860070 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:22.860278 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:22.860161 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret podName:252a776b-215a-41af-9c37-185dedf959ea nodeName:}" failed. No retries permitted until 2026-04-16 19:30:23.860144821 +0000 UTC m=+6.783841398 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret") pod "global-pull-secret-syncer-krqn6" (UID: "252a776b-215a-41af-9c37-185dedf959ea") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:23.149239 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:23.149186 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:23.149418 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:23.149262 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:23.149500 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:23.149431 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:23.149500 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:23.149449 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:23.149500 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:23.149461 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f58z2 for pod openshift-network-diagnostics/network-check-target-h6v8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:23.150011 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:23.149519 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2 podName:9406cd72-aaad-46f9-8e4d-59aa641ee42f nodeName:}" failed. No retries permitted until 2026-04-16 19:30:27.149500761 +0000 UTC m=+10.073197356 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-f58z2" (UniqueName: "kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2") pod "network-check-target-h6v8t" (UID: "9406cd72-aaad-46f9-8e4d-59aa641ee42f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:23.150011 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:23.149733 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:23.150011 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:23.149801 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs podName:1a9f09ff-a8fe-41b7-b833-8c5091a88fb6 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:27.149781386 +0000 UTC m=+10.073477972 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs") pod "network-metrics-daemon-p9mcp" (UID: "1a9f09ff-a8fe-41b7-b833-8c5091a88fb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:23.649258 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:23.648896 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:23.649258 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:23.649017 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea"
Apr 16 19:30:23.954860 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:23.954781 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:23.955312 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:23.954938 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:23.955312 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:23.954993 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret podName:252a776b-215a-41af-9c37-185dedf959ea nodeName:}" failed. No retries permitted until 2026-04-16 19:30:25.954975766 +0000 UTC m=+8.878672348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret") pod "global-pull-secret-syncer-krqn6" (UID: "252a776b-215a-41af-9c37-185dedf959ea") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:24.649149 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:24.648482 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:24.649149 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:24.648603 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6"
Apr 16 19:30:24.649149 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:24.648997 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:24.649149 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:24.649093 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f"
Apr 16 19:30:25.650355 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:25.650311 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:25.650834 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:25.650451 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea"
Apr 16 19:30:25.971773 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:25.971687 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:25.971939 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:25.971844 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:25.971939 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:25.971906 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret podName:252a776b-215a-41af-9c37-185dedf959ea nodeName:}" failed. No retries permitted until 2026-04-16 19:30:29.971887952 +0000 UTC m=+12.895584537 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret") pod "global-pull-secret-syncer-krqn6" (UID: "252a776b-215a-41af-9c37-185dedf959ea") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:26.648226 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:26.648191 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:26.648405 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:26.648195 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:26.648405 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:26.648326 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f"
Apr 16 19:30:26.648494 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:26.648431 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6"
Apr 16 19:30:27.180774 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:27.180733 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:27.181296 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:27.180802 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:27.181296 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:27.180972 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:27.181296 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:27.180994 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:27.181296 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:27.181006 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f58z2 for pod openshift-network-diagnostics/network-check-target-h6v8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:27.181296 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:27.181067 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2 podName:9406cd72-aaad-46f9-8e4d-59aa641ee42f nodeName:}" failed. No retries permitted until 2026-04-16 19:30:35.181048085 +0000 UTC m=+18.104744689 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-f58z2" (UniqueName: "kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2") pod "network-check-target-h6v8t" (UID: "9406cd72-aaad-46f9-8e4d-59aa641ee42f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:27.181296 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:27.181147 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:27.181296 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:27.181179 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs podName:1a9f09ff-a8fe-41b7-b833-8c5091a88fb6 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:35.181167577 +0000 UTC m=+18.104864171 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs") pod "network-metrics-daemon-p9mcp" (UID: "1a9f09ff-a8fe-41b7-b833-8c5091a88fb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:27.651092 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:27.651056 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:27.651271 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:27.651188 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea"
Apr 16 19:30:28.648371 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:28.648337 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:28.648814 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:28.648394 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:28.648814 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:28.648477 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f"
Apr 16 19:30:28.648814 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:28.648622 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6"
Apr 16 19:30:29.648202 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:29.648161 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:29.648404 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:29.648299 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea"
Apr 16 19:30:30.003291 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:30.003199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:30.003452 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:30.003372 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:30.003524 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:30.003454 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret podName:252a776b-215a-41af-9c37-185dedf959ea nodeName:}" failed. No retries permitted until 2026-04-16 19:30:38.003433943 +0000 UTC m=+20.927130535 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret") pod "global-pull-secret-syncer-krqn6" (UID: "252a776b-215a-41af-9c37-185dedf959ea") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:30.648382 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:30.648346 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:30.648567 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:30.648352 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:30.648567 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:30.648510 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f"
Apr 16 19:30:30.648986 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:30.648566 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6"
Apr 16 19:30:31.648570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:31.648529 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:31.648964 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:31.648699 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea"
Apr 16 19:30:32.647832 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:32.647791 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:32.648006 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:32.647810 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:32.648006 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:32.647930 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f"
Apr 16 19:30:32.648112 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:32.648028 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6"
Apr 16 19:30:33.648522 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:33.648484 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:33.648985 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:33.648624 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea"
Apr 16 19:30:34.648400 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:34.648367 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:34.648400 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:34.648384 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:34.648895 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:34.648494 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f"
Apr 16 19:30:34.648895 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:34.648633 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6"
Apr 16 19:30:35.238685 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:35.238626 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:35.238873 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:35.238720 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:35.238873 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:35.238787 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:35.238873 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:35.238862 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:35.238873 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:35.238869 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs podName:1a9f09ff-a8fe-41b7-b833-8c5091a88fb6 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.238845421 +0000 UTC m=+34.162542019 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs") pod "network-metrics-daemon-p9mcp" (UID: "1a9f09ff-a8fe-41b7-b833-8c5091a88fb6") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:35.239032 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:35.238880 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:35.239032 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:35.238894 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f58z2 for pod openshift-network-diagnostics/network-check-target-h6v8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:35.239032 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:35.238958 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2 podName:9406cd72-aaad-46f9-8e4d-59aa641ee42f nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.238942553 +0000 UTC m=+34.162639136 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-f58z2" (UniqueName: "kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2") pod "network-check-target-h6v8t" (UID: "9406cd72-aaad-46f9-8e4d-59aa641ee42f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:35.648096 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:35.648056 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6"
Apr 16 19:30:35.648279 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:35.648194 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea"
Apr 16 19:30:36.647669 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:36.647625 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t"
Apr 16 19:30:36.647669 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:36.647641 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp"
Apr 16 19:30:36.648157 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:36.647806 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f" Apr 16 19:30:36.648157 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:36.647940 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6" Apr 16 19:30:37.648722 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.648384 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:37.649381 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:37.648787 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea" Apr 16 19:30:37.748913 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.748879 2568 generic.go:358] "Generic (PLEG): container finished" podID="901bc30b-5940-410f-8379-b703113afa1a" containerID="c499c8e2e69b6280693a71ab440cc7f6b262ccd7c8f7122ba3fb8911141ed71e" exitCode=0 Apr 16 19:30:37.749214 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.749171 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" event={"ID":"901bc30b-5940-410f-8379-b703113afa1a","Type":"ContainerDied","Data":"c499c8e2e69b6280693a71ab440cc7f6b262ccd7c8f7122ba3fb8911141ed71e"} Apr 16 19:30:37.750800 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.750770 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2nws7" event={"ID":"6f2c8aa4-8f62-4688-865f-bcefc4f5fbc2","Type":"ContainerStarted","Data":"f86d4294f6c2055507d463a974d539e81d09c1f5a9d1cf5021cea84082f0a808"} Apr 16 19:30:37.754123 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.754095 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dxnh8" event={"ID":"e224c06c-213a-487e-844c-5d72da62ac07","Type":"ContainerStarted","Data":"08be0fb7282606fd52888849459b2f0c4d002431542ca02f17e925c8e5c7ff09"} Apr 16 19:30:37.759519 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.759486 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" event={"ID":"a149ee97-cada-4c41-b88a-0351739b3d48","Type":"ContainerStarted","Data":"2fe6e0eac0f28f0212f11e602c341342bd329a7d074148df0a37ad81118ad547"} Apr 16 19:30:37.759655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.759530 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" 
event={"ID":"a149ee97-cada-4c41-b88a-0351739b3d48","Type":"ContainerStarted","Data":"eef72ace716d5136831fb4bd942613d56843397bca13b65dc92cbdfbba36bdc7"} Apr 16 19:30:37.759655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.759546 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" event={"ID":"a149ee97-cada-4c41-b88a-0351739b3d48","Type":"ContainerStarted","Data":"6718f51284785f5a0f03fad43917ff18870939e15ab34feb2b6112e815899663"} Apr 16 19:30:37.759655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.759563 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" event={"ID":"a149ee97-cada-4c41-b88a-0351739b3d48","Type":"ContainerStarted","Data":"6af5827ebbbf95c3a93e1b56e6b3753d3828e9a134fc8282a39af368420015d6"} Apr 16 19:30:37.759655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.759575 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" event={"ID":"a149ee97-cada-4c41-b88a-0351739b3d48","Type":"ContainerStarted","Data":"fe3e16e35f5ac76c2c68555809a8a68c15fd18cab31a417c747ec09c15beeab6"} Apr 16 19:30:37.761116 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.761082 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g2sd7" event={"ID":"aff5494c-5205-4f24-a716-e1d25bc64f7c","Type":"ContainerStarted","Data":"78081c1d60bb13cd965b4fc4edc0e0edadf28108c09e17de55d94ed85a934797"} Apr 16 19:30:37.763531 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.763504 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vsb4w" event={"ID":"ab25b071-b28e-4ed4-8595-4ea92620f2bd","Type":"ContainerStarted","Data":"ccef4a05d06a46cabb1ee78b3a1e5ab2ce1d70d5789d36560bb1288192037d44"} Apr 16 19:30:37.765809 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.765780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" event={"ID":"930ab398-8c7e-4088-bae6-3bde68db4b00","Type":"ContainerStarted","Data":"d6d976152321d90abaf88c269a7811901f659e639c3649b81fba0df41883841c"} Apr 16 19:30:37.770091 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.770028 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-198.ec2.internal" podStartSLOduration=19.770008559 podStartE2EDuration="19.770008559s" podCreationTimestamp="2026-04-16 19:30:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:22.735272822 +0000 UTC m=+5.658969424" watchObservedRunningTime="2026-04-16 19:30:37.770008559 +0000 UTC m=+20.693705160" Apr 16 19:30:37.800028 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.799982 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2nws7" podStartSLOduration=3.787647325 podStartE2EDuration="20.799966401s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:20.059944104 +0000 UTC m=+2.983640688" lastFinishedPulling="2026-04-16 19:30:37.072263182 +0000 UTC m=+19.995959764" observedRunningTime="2026-04-16 19:30:37.799372688 +0000 UTC m=+20.723069289" watchObservedRunningTime="2026-04-16 19:30:37.799966401 +0000 UTC m=+20.723663001" Apr 16 19:30:37.800230 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.800209 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dxnh8" podStartSLOduration=3.7934822070000003 podStartE2EDuration="20.800203132s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:20.065545127 +0000 UTC m=+2.989241705" lastFinishedPulling="2026-04-16 19:30:37.072266052 +0000 UTC m=+19.995962630" 
observedRunningTime="2026-04-16 19:30:37.784776957 +0000 UTC m=+20.708473557" watchObservedRunningTime="2026-04-16 19:30:37.800203132 +0000 UTC m=+20.723899732" Apr 16 19:30:37.815920 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.815844 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g2sd7" podStartSLOduration=3.7712954610000002 podStartE2EDuration="20.815826697s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:20.061775059 +0000 UTC m=+2.985471637" lastFinishedPulling="2026-04-16 19:30:37.106306295 +0000 UTC m=+20.030002873" observedRunningTime="2026-04-16 19:30:37.815240776 +0000 UTC m=+20.738937373" watchObservedRunningTime="2026-04-16 19:30:37.815826697 +0000 UTC m=+20.739523296" Apr 16 19:30:37.827766 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:37.827708 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vsb4w" podStartSLOduration=3.824483797 podStartE2EDuration="20.827669147s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:20.069177796 +0000 UTC m=+2.992874382" lastFinishedPulling="2026-04-16 19:30:37.072363139 +0000 UTC m=+19.996059732" observedRunningTime="2026-04-16 19:30:37.827478147 +0000 UTC m=+20.751174748" watchObservedRunningTime="2026-04-16 19:30:37.827669147 +0000 UTC m=+20.751365747" Apr 16 19:30:38.059714 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.059585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:38.059877 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:38.059787 2568 secret.go:189] Couldn't get secret 
kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:30:38.059877 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:38.059870 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret podName:252a776b-215a-41af-9c37-185dedf959ea nodeName:}" failed. No retries permitted until 2026-04-16 19:30:54.059848972 +0000 UTC m=+36.983545565 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret") pod "global-pull-secret-syncer-krqn6" (UID: "252a776b-215a-41af-9c37-185dedf959ea") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:30:38.244898 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.244622 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:30:38.574575 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.574540 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ftbrq"] Apr 16 19:30:38.574780 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.574577 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:30:38.244893157Z","UUID":"05197c5c-24d1-4b89-bab6-4227647f54b8","Handler":null,"Name":"","Endpoint":""} Apr 16 19:30:38.576616 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.576594 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:30:38.576616 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.576621 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: 
ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:30:38.577967 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.577948 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.580501 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.580475 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-fr6fn\"" Apr 16 19:30:38.580900 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.580881 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:30:38.581135 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.581115 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:30:38.647862 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.647826 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:38.648069 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.647826 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:38.648069 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:38.647982 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6" Apr 16 19:30:38.648069 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:38.648019 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f" Apr 16 19:30:38.664016 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.663952 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fd63a5dc-d580-47b6-b37d-b1972fcea60a-hosts-file\") pod \"node-resolver-ftbrq\" (UID: \"fd63a5dc-d580-47b6-b37d-b1972fcea60a\") " pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.664417 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.664099 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fd63a5dc-d580-47b6-b37d-b1972fcea60a-tmp-dir\") pod \"node-resolver-ftbrq\" (UID: \"fd63a5dc-d580-47b6-b37d-b1972fcea60a\") " pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.664417 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.664142 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkxbt\" (UniqueName: \"kubernetes.io/projected/fd63a5dc-d580-47b6-b37d-b1972fcea60a-kube-api-access-gkxbt\") pod \"node-resolver-ftbrq\" (UID: \"fd63a5dc-d580-47b6-b37d-b1972fcea60a\") " pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.765516 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.764954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/fd63a5dc-d580-47b6-b37d-b1972fcea60a-hosts-file\") pod \"node-resolver-ftbrq\" (UID: \"fd63a5dc-d580-47b6-b37d-b1972fcea60a\") " pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.765516 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.765050 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fd63a5dc-d580-47b6-b37d-b1972fcea60a-tmp-dir\") pod \"node-resolver-ftbrq\" (UID: \"fd63a5dc-d580-47b6-b37d-b1972fcea60a\") " pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.765516 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.765064 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fd63a5dc-d580-47b6-b37d-b1972fcea60a-hosts-file\") pod \"node-resolver-ftbrq\" (UID: \"fd63a5dc-d580-47b6-b37d-b1972fcea60a\") " pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.765516 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.765076 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkxbt\" (UniqueName: \"kubernetes.io/projected/fd63a5dc-d580-47b6-b37d-b1972fcea60a-kube-api-access-gkxbt\") pod \"node-resolver-ftbrq\" (UID: \"fd63a5dc-d580-47b6-b37d-b1972fcea60a\") " pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.765516 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.765427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fd63a5dc-d580-47b6-b37d-b1972fcea60a-tmp-dir\") pod \"node-resolver-ftbrq\" (UID: \"fd63a5dc-d580-47b6-b37d-b1972fcea60a\") " pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.772154 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.772118 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" 
event={"ID":"a149ee97-cada-4c41-b88a-0351739b3d48","Type":"ContainerStarted","Data":"9c1e546f1f8df7802f90731b683bfbee0bdf0c68bb2995d185e8ff4e152b8e5f"} Apr 16 19:30:38.774102 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.774065 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" event={"ID":"930ab398-8c7e-4088-bae6-3bde68db4b00","Type":"ContainerStarted","Data":"a3a42679d428d6e17dace5f058e982b4b5deeee03bc3c5edbcc14ef9279a70f4"} Apr 16 19:30:38.776107 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.775587 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-spns4" event={"ID":"d2974a6f-4954-4cca-a5c8-ba2980890b8d","Type":"ContainerStarted","Data":"5e51e27d8f8d2c9ec6d89f18e59534d166963c148c0197bde81bbc60957346bb"} Apr 16 19:30:38.777115 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.777090 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkxbt\" (UniqueName: \"kubernetes.io/projected/fd63a5dc-d580-47b6-b37d-b1972fcea60a-kube-api-access-gkxbt\") pod \"node-resolver-ftbrq\" (UID: \"fd63a5dc-d580-47b6-b37d-b1972fcea60a\") " pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:38.789274 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.789215 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-spns4" podStartSLOduration=4.784164596 podStartE2EDuration="21.789198868s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:20.067274016 +0000 UTC m=+2.990970593" lastFinishedPulling="2026-04-16 19:30:37.072308287 +0000 UTC m=+19.996004865" observedRunningTime="2026-04-16 19:30:38.789067964 +0000 UTC m=+21.712764564" watchObservedRunningTime="2026-04-16 19:30:38.789198868 +0000 UTC m=+21.712895468" Apr 16 19:30:38.887557 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:38.887484 2568 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ftbrq" Apr 16 19:30:39.001081 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:39.001042 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd63a5dc_d580_47b6_b37d_b1972fcea60a.slice/crio-2ac720638650c80091f2cebe01f0bb4f540f9ce3f46088b8a58a969cb18a293d WatchSource:0}: Error finding container 2ac720638650c80091f2cebe01f0bb4f540f9ce3f46088b8a58a969cb18a293d: Status 404 returned error can't find the container with id 2ac720638650c80091f2cebe01f0bb4f540f9ce3f46088b8a58a969cb18a293d Apr 16 19:30:39.647756 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:39.647721 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:39.648004 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:39.647860 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea" Apr 16 19:30:39.780099 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:39.780060 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" event={"ID":"930ab398-8c7e-4088-bae6-3bde68db4b00","Type":"ContainerStarted","Data":"3cae1cebb648d84ddec90368a121b970003ba84210bd46642b32f91f6215bd21"} Apr 16 19:30:39.781535 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:39.781502 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ftbrq" event={"ID":"fd63a5dc-d580-47b6-b37d-b1972fcea60a","Type":"ContainerStarted","Data":"bec83a8fd16182a4792fec13f8d13031da8be872b897091ced1d37de11ed154b"} Apr 16 19:30:39.781690 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:39.781541 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ftbrq" event={"ID":"fd63a5dc-d580-47b6-b37d-b1972fcea60a","Type":"ContainerStarted","Data":"2ac720638650c80091f2cebe01f0bb4f540f9ce3f46088b8a58a969cb18a293d"} Apr 16 19:30:39.795901 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:39.795846 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2klnt" podStartSLOduration=3.819572786 podStartE2EDuration="22.795827869s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:20.071649378 +0000 UTC m=+2.995345972" lastFinishedPulling="2026-04-16 19:30:39.047904477 +0000 UTC m=+21.971601055" observedRunningTime="2026-04-16 19:30:39.795041245 +0000 UTC m=+22.718737856" watchObservedRunningTime="2026-04-16 19:30:39.795827869 +0000 UTC m=+22.719524473" Apr 16 19:30:39.808504 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:39.808465 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ftbrq" podStartSLOduration=1.80845105 
podStartE2EDuration="1.80845105s" podCreationTimestamp="2026-04-16 19:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:39.808247161 +0000 UTC m=+22.731943760" watchObservedRunningTime="2026-04-16 19:30:39.80845105 +0000 UTC m=+22.732147649" Apr 16 19:30:40.648575 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:40.648370 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:40.648790 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:40.648370 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:40.648790 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:40.648688 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f" Apr 16 19:30:40.648871 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:40.648791 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6" Apr 16 19:30:40.787320 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:40.787270 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" event={"ID":"a149ee97-cada-4c41-b88a-0351739b3d48","Type":"ContainerStarted","Data":"2d7721e624f0697012e9cfc2912c53213867e7b4dbb550237ba317fc1a3916c0"} Apr 16 19:30:41.648199 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:41.648163 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:41.648367 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:41.648276 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea" Apr 16 19:30:42.096541 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.096509 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dxnh8" Apr 16 19:30:42.097249 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.097231 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dxnh8" Apr 16 19:30:42.647764 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.647730 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:42.647929 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.647730 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:42.647929 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:42.647879 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6" Apr 16 19:30:42.647929 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:42.647905 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f" Apr 16 19:30:42.792627 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.792590 2568 generic.go:358] "Generic (PLEG): container finished" podID="901bc30b-5940-410f-8379-b703113afa1a" containerID="53e24f98f16b615b2764a29876a082cea3b9fd98fc6c56f1038d8eab5208171a" exitCode=0 Apr 16 19:30:42.792795 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.792692 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" event={"ID":"901bc30b-5940-410f-8379-b703113afa1a","Type":"ContainerDied","Data":"53e24f98f16b615b2764a29876a082cea3b9fd98fc6c56f1038d8eab5208171a"} Apr 16 19:30:42.796317 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.796094 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" 
event={"ID":"a149ee97-cada-4c41-b88a-0351739b3d48","Type":"ContainerStarted","Data":"e476153957b76494cd06ced0dc1fb63960251a8eb46deed4115c2cf4810b4fba"} Apr 16 19:30:42.796414 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.796342 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dxnh8" Apr 16 19:30:42.796822 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.796806 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dxnh8" Apr 16 19:30:42.841212 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:42.841155 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" podStartSLOduration=8.673340008 podStartE2EDuration="25.841138369s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:20.062506361 +0000 UTC m=+2.986202951" lastFinishedPulling="2026-04-16 19:30:37.230304728 +0000 UTC m=+20.154001312" observedRunningTime="2026-04-16 19:30:42.840855146 +0000 UTC m=+25.764551745" watchObservedRunningTime="2026-04-16 19:30:42.841138369 +0000 UTC m=+25.764834970" Apr 16 19:30:43.647810 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.647777 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:43.648296 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:43.647895 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea" Apr 16 19:30:43.798844 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.798773 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:43.798844 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.798811 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:43.798844 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.798825 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:43.815609 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.815582 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:43.816632 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.816609 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc" Apr 16 19:30:43.980978 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.980949 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h6v8t"] Apr 16 19:30:43.981121 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.981072 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:43.981201 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:43.981178 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f" Apr 16 19:30:43.981811 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.981789 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p9mcp"] Apr 16 19:30:43.981933 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.981921 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:43.982046 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:43.982024 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6" Apr 16 19:30:43.982534 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.982514 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-krqn6"] Apr 16 19:30:43.982638 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:43.982617 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:43.982760 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:43.982732 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea" Apr 16 19:30:44.802172 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:44.802136 2568 generic.go:358] "Generic (PLEG): container finished" podID="901bc30b-5940-410f-8379-b703113afa1a" containerID="36dab24e45c9e3a33cfcc4454cdbcacf7f59759b9a385f59ad21c2e6023b5336" exitCode=0 Apr 16 19:30:44.802999 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:44.802219 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" event={"ID":"901bc30b-5940-410f-8379-b703113afa1a","Type":"ContainerDied","Data":"36dab24e45c9e3a33cfcc4454cdbcacf7f59759b9a385f59ad21c2e6023b5336"} Apr 16 19:30:45.650622 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:45.650588 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:45.650622 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:45.650603 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:45.650622 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:45.650588 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:45.650902 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:45.650729 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f" Apr 16 19:30:45.650902 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:45.650779 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6" Apr 16 19:30:45.650902 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:45.650814 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea" Apr 16 19:30:45.806632 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:45.806598 2568 generic.go:358] "Generic (PLEG): container finished" podID="901bc30b-5940-410f-8379-b703113afa1a" containerID="50f529ec7eaccfdb5b6de85e620347788c593de09aacc0e2b177457749028557" exitCode=0 Apr 16 19:30:45.807028 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:45.806706 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" event={"ID":"901bc30b-5940-410f-8379-b703113afa1a","Type":"ContainerDied","Data":"50f529ec7eaccfdb5b6de85e620347788c593de09aacc0e2b177457749028557"} Apr 16 19:30:47.651537 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:47.651309 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:47.652033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:47.651309 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:47.652033 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:47.651629 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f" Apr 16 19:30:47.652033 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:47.651320 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:47.652033 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:47.651738 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea" Apr 16 19:30:47.652033 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:47.651823 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6" Apr 16 19:30:49.651332 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.651285 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:49.651770 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.651309 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:49.651770 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:49.651413 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-krqn6" podUID="252a776b-215a-41af-9c37-185dedf959ea" Apr 16 19:30:49.651770 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:49.651499 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p9mcp" podUID="1a9f09ff-a8fe-41b7-b833-8c5091a88fb6" Apr 16 19:30:49.651770 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.651309 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:49.651770 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:49.651588 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h6v8t" podUID="9406cd72-aaad-46f9-8e4d-59aa641ee42f" Apr 16 19:30:49.891144 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.891112 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-198.ec2.internal" event="NodeReady" Apr 16 19:30:49.891332 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.891264 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:30:49.936617 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.936530 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-845cbc8db8-wbt2x"] Apr 16 19:30:49.964515 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.964302 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-brrh2"] Apr 16 19:30:49.964515 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.964470 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:49.967206 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.967176 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 19:30:49.967410 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.967393 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 19:30:49.967540 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.967521 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 19:30:49.967662 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.967579 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-25xm9\"" Apr 16 19:30:49.974314 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.974278 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 19:30:49.983218 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.983189 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gqlxj"] Apr 16 19:30:49.983372 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.983355 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:49.987640 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.987617 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:30:49.991250 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.988365 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fkm8b\"" Apr 16 19:30:49.991250 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.988392 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:30:49.997267 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.997245 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-845cbc8db8-wbt2x"] Apr 16 19:30:49.997387 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.997273 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-brrh2"] Apr 16 19:30:49.997387 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.997291 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gqlxj"] Apr 16 19:30:49.997500 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.997433 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:49.999883 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.999858 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:30:49.999984 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.999905 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:30:49.999984 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:49.999924 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fqt5t\"" Apr 16 19:30:50.000186 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.000165 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:30:50.049444 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049411 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-ca-trust-extracted\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.049640 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049457 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-trusted-ca\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.049640 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049574 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf4z5\" (UniqueName: \"kubernetes.io/projected/d702f7c4-fdbf-4294-9aee-07546029945f-kube-api-access-pf4z5\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.049640 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.049640 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-certificates\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.049891 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049658 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-installation-pull-secrets\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.049891 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049756 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d702f7c4-fdbf-4294-9aee-07546029945f-config-volume\") pod 
\"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.049891 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049783 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d702f7c4-fdbf-4294-9aee-07546029945f-tmp-dir\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.049891 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049816 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-image-registry-private-configuration\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.049891 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049843 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbt4b\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-kube-api-access-pbt4b\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.049891 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049880 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.050122 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.049902 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-bound-sa-token\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.150763 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.150706 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pf4z5\" (UniqueName: \"kubernetes.io/projected/d702f7c4-fdbf-4294-9aee-07546029945f-kube-api-access-pf4z5\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.150952 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.150782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.150952 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.150807 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-certificates\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.150952 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.150830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-installation-pull-secrets\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " 
pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.150952 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.150884 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d702f7c4-fdbf-4294-9aee-07546029945f-config-volume\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.150952 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.150906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d702f7c4-fdbf-4294-9aee-07546029945f-tmp-dir\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.150952 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.150911 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:30:50.150952 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.150933 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-845cbc8db8-wbt2x: secret "image-registry-tls" not found Apr 16 19:30:50.150952 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.150936 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f6d9\" (UniqueName: \"kubernetes.io/projected/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-kube-api-access-5f6d9\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:50.151362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.150972 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-image-registry-private-configuration\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.151362 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.150989 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls podName:b98edfdf-3c43-46b2-a5e7-d5733e434e5b nodeName:}" failed. No retries permitted until 2026-04-16 19:30:50.650969463 +0000 UTC m=+33.574666064 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls") pod "image-registry-845cbc8db8-wbt2x" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b") : secret "image-registry-tls" not found Apr 16 19:30:50.151362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbt4b\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-kube-api-access-pbt4b\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.151362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151061 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.151362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-bound-sa-token\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.151362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151111 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:50.151362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-ca-trust-extracted\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.151362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151173 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-trusted-ca\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.151362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151340 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d702f7c4-fdbf-4294-9aee-07546029945f-tmp-dir\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.151842 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.151440 2568 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:30:50.151842 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.151499 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls podName:d702f7c4-fdbf-4294-9aee-07546029945f nodeName:}" failed. No retries permitted until 2026-04-16 19:30:50.651484047 +0000 UTC m=+33.575180639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls") pod "dns-default-brrh2" (UID: "d702f7c4-fdbf-4294-9aee-07546029945f") : secret "dns-default-metrics-tls" not found Apr 16 19:30:50.151842 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151510 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d702f7c4-fdbf-4294-9aee-07546029945f-config-volume\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.151842 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151548 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-certificates\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.151842 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.151788 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-ca-trust-extracted\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.152080 
ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.152056 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-trusted-ca\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.155739 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.155718 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-installation-pull-secrets\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.155869 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.155717 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-image-registry-private-configuration\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.163208 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.163181 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbt4b\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-kube-api-access-pbt4b\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.163335 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.163313 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf4z5\" (UniqueName: 
\"kubernetes.io/projected/d702f7c4-fdbf-4294-9aee-07546029945f-kube-api-access-pf4z5\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.163853 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.163823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-bound-sa-token\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.252039 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.251949 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5f6d9\" (UniqueName: \"kubernetes.io/projected/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-kube-api-access-5f6d9\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:50.252039 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.252012 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:50.252218 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.252111 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:30:50.252218 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.252162 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert podName:09f9c8cc-cfd9-462c-9967-1afa5e6543ea nodeName:}" failed. 
No retries permitted until 2026-04-16 19:30:50.752147298 +0000 UTC m=+33.675843876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert") pod "ingress-canary-gqlxj" (UID: "09f9c8cc-cfd9-462c-9967-1afa5e6543ea") : secret "canary-serving-cert" not found Apr 16 19:30:50.260850 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.260817 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f6d9\" (UniqueName: \"kubernetes.io/projected/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-kube-api-access-5f6d9\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:50.656066 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.656028 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:50.656648 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.656129 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:50.656648 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.656201 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:30:50.656648 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.656229 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-845cbc8db8-wbt2x: secret "image-registry-tls" not found Apr 16 19:30:50.656648 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.656265 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:30:50.656648 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.656297 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls podName:b98edfdf-3c43-46b2-a5e7-d5733e434e5b nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.656277987 +0000 UTC m=+34.579974579 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls") pod "image-registry-845cbc8db8-wbt2x" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b") : secret "image-registry-tls" not found Apr 16 19:30:50.656648 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.656323 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls podName:d702f7c4-fdbf-4294-9aee-07546029945f nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.656308247 +0000 UTC m=+34.580004825 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls") pod "dns-default-brrh2" (UID: "d702f7c4-fdbf-4294-9aee-07546029945f") : secret "dns-default-metrics-tls" not found Apr 16 19:30:50.757335 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:50.757298 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:50.757534 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.757460 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:30:50.757607 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:50.757542 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert podName:09f9c8cc-cfd9-462c-9967-1afa5e6543ea nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.757520862 +0000 UTC m=+34.681217463 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert") pod "ingress-canary-gqlxj" (UID: "09f9c8cc-cfd9-462c-9967-1afa5e6543ea") : secret "canary-serving-cert" not found Apr 16 19:30:51.262264 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.262222 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:51.262529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.262285 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:51.262529 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.262372 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:51.262529 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.262442 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs podName:1a9f09ff-a8fe-41b7-b833-8c5091a88fb6 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:23.262421464 +0000 UTC m=+66.186118046 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs") pod "network-metrics-daemon-p9mcp" (UID: "1a9f09ff-a8fe-41b7-b833-8c5091a88fb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:51.262529 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.262379 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:30:51.262529 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.262486 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:30:51.262529 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.262509 2568 projected.go:194] Error preparing data for projected volume kube-api-access-f58z2 for pod openshift-network-diagnostics/network-check-target-h6v8t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:51.262788 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.262549 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2 podName:9406cd72-aaad-46f9-8e4d-59aa641ee42f nodeName:}" failed. No retries permitted until 2026-04-16 19:31:23.262537226 +0000 UTC m=+66.186233817 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f58z2" (UniqueName: "kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2") pod "network-check-target-h6v8t" (UID: "9406cd72-aaad-46f9-8e4d-59aa641ee42f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:51.647886 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.647846 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:51.648060 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.647847 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:30:51.648060 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.647854 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:30:51.651593 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.651569 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:30:51.651747 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.651579 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:30:51.651817 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.651774 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-52hfs\"" Apr 16 19:30:51.651817 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.651809 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p422q\"" Apr 16 19:30:51.651925 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.651887 
2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:30:51.651979 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.651970 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:30:51.665581 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.665557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:51.665959 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.665645 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:51.665959 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.665709 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:30:51.665959 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.665772 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:30:51.665959 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.665790 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-845cbc8db8-wbt2x: secret "image-registry-tls" not found Apr 16 19:30:51.665959 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.665778 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls podName:d702f7c4-fdbf-4294-9aee-07546029945f nodeName:}" failed. No retries permitted until 2026-04-16 19:30:53.665760663 +0000 UTC m=+36.589457241 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls") pod "dns-default-brrh2" (UID: "d702f7c4-fdbf-4294-9aee-07546029945f") : secret "dns-default-metrics-tls" not found Apr 16 19:30:51.665959 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.665837 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls podName:b98edfdf-3c43-46b2-a5e7-d5733e434e5b nodeName:}" failed. No retries permitted until 2026-04-16 19:30:53.665821023 +0000 UTC m=+36.589517638 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls") pod "image-registry-845cbc8db8-wbt2x" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b") : secret "image-registry-tls" not found Apr 16 19:30:51.766509 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:51.766463 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:51.766729 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.766580 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:30:51.766729 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:51.766632 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert 
podName:09f9c8cc-cfd9-462c-9967-1afa5e6543ea nodeName:}" failed. No retries permitted until 2026-04-16 19:30:53.766618656 +0000 UTC m=+36.690315234 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert") pod "ingress-canary-gqlxj" (UID: "09f9c8cc-cfd9-462c-9967-1afa5e6543ea") : secret "canary-serving-cert" not found Apr 16 19:30:52.821533 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:52.821498 2568 generic.go:358] "Generic (PLEG): container finished" podID="901bc30b-5940-410f-8379-b703113afa1a" containerID="6445b6d952ac3523708cea0ff6e206a8f35fd05b91195c885db6c327b23ec6e9" exitCode=0 Apr 16 19:30:52.821917 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:52.821553 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" event={"ID":"901bc30b-5940-410f-8379-b703113afa1a","Type":"ContainerDied","Data":"6445b6d952ac3523708cea0ff6e206a8f35fd05b91195c885db6c327b23ec6e9"} Apr 16 19:30:53.682593 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:53.682547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:53.682794 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:53.682642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:53.682794 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:53.682720 2568 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:30:53.682794 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:53.682744 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-845cbc8db8-wbt2x: secret "image-registry-tls" not found Apr 16 19:30:53.682794 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:53.682780 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:30:53.682946 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:53.682802 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls podName:b98edfdf-3c43-46b2-a5e7-d5733e434e5b nodeName:}" failed. No retries permitted until 2026-04-16 19:30:57.682785715 +0000 UTC m=+40.606482293 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls") pod "image-registry-845cbc8db8-wbt2x" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b") : secret "image-registry-tls" not found Apr 16 19:30:53.682946 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:53.682828 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls podName:d702f7c4-fdbf-4294-9aee-07546029945f nodeName:}" failed. No retries permitted until 2026-04-16 19:30:57.68281304 +0000 UTC m=+40.606509645 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls") pod "dns-default-brrh2" (UID: "d702f7c4-fdbf-4294-9aee-07546029945f") : secret "dns-default-metrics-tls" not found Apr 16 19:30:53.784085 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:53.784000 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:53.784227 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:53.784155 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:30:53.784227 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:53.784220 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert podName:09f9c8cc-cfd9-462c-9967-1afa5e6543ea nodeName:}" failed. No retries permitted until 2026-04-16 19:30:57.78420575 +0000 UTC m=+40.707902331 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert") pod "ingress-canary-gqlxj" (UID: "09f9c8cc-cfd9-462c-9967-1afa5e6543ea") : secret "canary-serving-cert" not found Apr 16 19:30:53.825592 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:53.825411 2568 generic.go:358] "Generic (PLEG): container finished" podID="901bc30b-5940-410f-8379-b703113afa1a" containerID="0d00481929177206516ec48376f075df426ad3703e1d265a82d750532ab7ec79" exitCode=0 Apr 16 19:30:53.825592 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:53.825491 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" event={"ID":"901bc30b-5940-410f-8379-b703113afa1a","Type":"ContainerDied","Data":"0d00481929177206516ec48376f075df426ad3703e1d265a82d750532ab7ec79"} Apr 16 19:30:54.087459 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:54.087422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:54.090830 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:54.090804 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/252a776b-215a-41af-9c37-185dedf959ea-original-pull-secret\") pod \"global-pull-secret-syncer-krqn6\" (UID: \"252a776b-215a-41af-9c37-185dedf959ea\") " pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:54.357949 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:54.357859 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-krqn6" Apr 16 19:30:54.511184 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:54.511151 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-krqn6"] Apr 16 19:30:54.514897 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:30:54.514857 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252a776b_215a_41af_9c37_185dedf959ea.slice/crio-11ec851e8d145bcb5a9b571c8bc1ebd1892f4025dad21fb8b9672c39c10a6278 WatchSource:0}: Error finding container 11ec851e8d145bcb5a9b571c8bc1ebd1892f4025dad21fb8b9672c39c10a6278: Status 404 returned error can't find the container with id 11ec851e8d145bcb5a9b571c8bc1ebd1892f4025dad21fb8b9672c39c10a6278 Apr 16 19:30:54.829304 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:54.829218 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-krqn6" event={"ID":"252a776b-215a-41af-9c37-185dedf959ea","Type":"ContainerStarted","Data":"11ec851e8d145bcb5a9b571c8bc1ebd1892f4025dad21fb8b9672c39c10a6278"} Apr 16 19:30:54.832823 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:54.832790 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" event={"ID":"901bc30b-5940-410f-8379-b703113afa1a","Type":"ContainerStarted","Data":"4dbc3793302e8a23929ee978145aa883bf27aaac088dd8f42e4d49d04396bec0"} Apr 16 19:30:54.855443 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:54.855391 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gt9gg" podStartSLOduration=6.039757986 podStartE2EDuration="37.855375355s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:20.069526095 +0000 UTC m=+2.993222676" lastFinishedPulling="2026-04-16 19:30:51.885143449 +0000 UTC m=+34.808840045" 
observedRunningTime="2026-04-16 19:30:54.855232232 +0000 UTC m=+37.778928881" watchObservedRunningTime="2026-04-16 19:30:54.855375355 +0000 UTC m=+37.779071976" Apr 16 19:30:57.714281 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:57.714056 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:30:57.714767 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:57.714342 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:30:57.714767 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:57.714220 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:30:57.714767 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:57.714434 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls podName:d702f7c4-fdbf-4294-9aee-07546029945f nodeName:}" failed. No retries permitted until 2026-04-16 19:31:05.714413049 +0000 UTC m=+48.638109649 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls") pod "dns-default-brrh2" (UID: "d702f7c4-fdbf-4294-9aee-07546029945f") : secret "dns-default-metrics-tls" not found Apr 16 19:30:57.714767 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:57.714493 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:30:57.714767 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:57.714507 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-845cbc8db8-wbt2x: secret "image-registry-tls" not found Apr 16 19:30:57.714767 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:57.714549 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls podName:b98edfdf-3c43-46b2-a5e7-d5733e434e5b nodeName:}" failed. No retries permitted until 2026-04-16 19:31:05.714536704 +0000 UTC m=+48.638233284 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls") pod "image-registry-845cbc8db8-wbt2x" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b") : secret "image-registry-tls" not found Apr 16 19:30:57.815058 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:57.815022 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:30:57.815243 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:57.815203 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:30:57.815289 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:30:57.815279 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert podName:09f9c8cc-cfd9-462c-9967-1afa5e6543ea nodeName:}" failed. No retries permitted until 2026-04-16 19:31:05.815257267 +0000 UTC m=+48.738953845 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert") pod "ingress-canary-gqlxj" (UID: "09f9c8cc-cfd9-462c-9967-1afa5e6543ea") : secret "canary-serving-cert" not found
Apr 16 19:30:59.845151 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:59.845116 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-krqn6" event={"ID":"252a776b-215a-41af-9c37-185dedf959ea","Type":"ContainerStarted","Data":"98ddac877ddcc7313324b30a53b86231d38a93adadd0cb0f3a407d83bdbe7275"}
Apr 16 19:30:59.859892 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:30:59.859839 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-krqn6" podStartSLOduration=32.987838442 podStartE2EDuration="37.859822879s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:54.516761481 +0000 UTC m=+37.440458063" lastFinishedPulling="2026-04-16 19:30:59.38874592 +0000 UTC m=+42.312442500" observedRunningTime="2026-04-16 19:30:59.859395322 +0000 UTC m=+42.783091923" watchObservedRunningTime="2026-04-16 19:30:59.859822879 +0000 UTC m=+42.783519479"
Apr 16 19:31:05.779789 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:05.779748 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2"
Apr 16 19:31:05.780151 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:05.779816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls\") pod \"image-registry-845cbc8db8-wbt2x\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") " pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x"
Apr 16 19:31:05.780151 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:05.779902 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:31:05.780151 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:05.779988 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls podName:d702f7c4-fdbf-4294-9aee-07546029945f nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.779970548 +0000 UTC m=+64.703667126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls") pod "dns-default-brrh2" (UID: "d702f7c4-fdbf-4294-9aee-07546029945f") : secret "dns-default-metrics-tls" not found
Apr 16 19:31:05.780151 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:05.779909 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:31:05.780151 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:05.780025 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-845cbc8db8-wbt2x: secret "image-registry-tls" not found
Apr 16 19:31:05.780151 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:05.780076 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls podName:b98edfdf-3c43-46b2-a5e7-d5733e434e5b nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.780064619 +0000 UTC m=+64.703761211 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls") pod "image-registry-845cbc8db8-wbt2x" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b") : secret "image-registry-tls" not found
Apr 16 19:31:05.880483 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:05.880444 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj"
Apr 16 19:31:05.880667 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:05.880592 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:31:05.880753 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:05.880695 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert podName:09f9c8cc-cfd9-462c-9967-1afa5e6543ea nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.88065563 +0000 UTC m=+64.804352207 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert") pod "ingress-canary-gqlxj" (UID: "09f9c8cc-cfd9-462c-9967-1afa5e6543ea") : secret "canary-serving-cert" not found
Apr 16 19:31:15.817484 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:15.817456 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vf8sc"
Apr 16 19:31:19.614877 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.614839 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"]
Apr 16 19:31:19.618960 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.618931 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"]
Apr 16 19:31:19.619154 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.619133 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"
Apr 16 19:31:19.621456 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.621433 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-499q8\""
Apr 16 19:31:19.621722 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.621701 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 19:31:19.621804 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.621739 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 19:31:19.621804 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.621756 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.621804 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.621758 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 19:31:19.622452 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.622438 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 19:31:19.624559 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.624541 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 19:31:19.629801 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.629781 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"]
Apr 16 19:31:19.630622 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.630599 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"]
Apr 16 19:31:19.682004 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.681976 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d57dce11-4758-49ee-abae-5ed6192a28a3-tmp\") pod \"klusterlet-addon-workmgr-557b7d9676-db5sm\" (UID: \"d57dce11-4758-49ee-abae-5ed6192a28a3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.682133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.682056 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69992ee4-b554-4ee1-ac4e-537e92560ff4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-685dd866df-djm4j\" (UID: \"69992ee4-b554-4ee1-ac4e-537e92560ff4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"
Apr 16 19:31:19.682133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.682086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgtq9\" (UniqueName: \"kubernetes.io/projected/69992ee4-b554-4ee1-ac4e-537e92560ff4-kube-api-access-bgtq9\") pod \"managed-serviceaccount-addon-agent-685dd866df-djm4j\" (UID: \"69992ee4-b554-4ee1-ac4e-537e92560ff4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"
Apr 16 19:31:19.682133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.682129 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d57dce11-4758-49ee-abae-5ed6192a28a3-klusterlet-config\") pod \"klusterlet-addon-workmgr-557b7d9676-db5sm\" (UID: \"d57dce11-4758-49ee-abae-5ed6192a28a3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.682273 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.682179 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qcv2\" (UniqueName: \"kubernetes.io/projected/d57dce11-4758-49ee-abae-5ed6192a28a3-kube-api-access-9qcv2\") pod \"klusterlet-addon-workmgr-557b7d9676-db5sm\" (UID: \"d57dce11-4758-49ee-abae-5ed6192a28a3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.721795 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.721768 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"]
Apr 16 19:31:19.725211 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.725188 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.727076 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.727056 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-845cbc8db8-wbt2x"]
Apr 16 19:31:19.727236 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:19.727200 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" podUID="b98edfdf-3c43-46b2-a5e7-d5733e434e5b"
Apr 16 19:31:19.728354 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.728332 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hbbrm\""
Apr 16 19:31:19.728505 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.728469 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 19:31:19.728505 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.728472 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 19:31:19.728660 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.728511 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 19:31:19.729792 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.729774 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 19:31:19.736402 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.736382 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"]
Apr 16 19:31:19.782465 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.782439 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d57dce11-4758-49ee-abae-5ed6192a28a3-klusterlet-config\") pod \"klusterlet-addon-workmgr-557b7d9676-db5sm\" (UID: \"d57dce11-4758-49ee-abae-5ed6192a28a3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.782570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.782484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qcv2\" (UniqueName: \"kubernetes.io/projected/d57dce11-4758-49ee-abae-5ed6192a28a3-kube-api-access-9qcv2\") pod \"klusterlet-addon-workmgr-557b7d9676-db5sm\" (UID: \"d57dce11-4758-49ee-abae-5ed6192a28a3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.782570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.782515 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8bc472a4-e44f-4c82-a32b-3cda6b957d95-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-dpb5r\" (UID: \"8bc472a4-e44f-4c82-a32b-3cda6b957d95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.782726 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.782703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d57dce11-4758-49ee-abae-5ed6192a28a3-tmp\") pod \"klusterlet-addon-workmgr-557b7d9676-db5sm\" (UID: \"d57dce11-4758-49ee-abae-5ed6192a28a3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.782836 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.782821 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc472a4-e44f-4c82-a32b-3cda6b957d95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dpb5r\" (UID: \"8bc472a4-e44f-4c82-a32b-3cda6b957d95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.782873 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.782850 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69992ee4-b554-4ee1-ac4e-537e92560ff4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-685dd866df-djm4j\" (UID: \"69992ee4-b554-4ee1-ac4e-537e92560ff4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"
Apr 16 19:31:19.782916 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.782879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjcv4\" (UniqueName: \"kubernetes.io/projected/8bc472a4-e44f-4c82-a32b-3cda6b957d95-kube-api-access-gjcv4\") pod \"cluster-monitoring-operator-75587bd455-dpb5r\" (UID: \"8bc472a4-e44f-4c82-a32b-3cda6b957d95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.782993 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.782971 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgtq9\" (UniqueName: \"kubernetes.io/projected/69992ee4-b554-4ee1-ac4e-537e92560ff4-kube-api-access-bgtq9\") pod \"managed-serviceaccount-addon-agent-685dd866df-djm4j\" (UID: \"69992ee4-b554-4ee1-ac4e-537e92560ff4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"
Apr 16 19:31:19.783059 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.783042 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d57dce11-4758-49ee-abae-5ed6192a28a3-tmp\") pod \"klusterlet-addon-workmgr-557b7d9676-db5sm\" (UID: \"d57dce11-4758-49ee-abae-5ed6192a28a3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.785668 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.785650 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/69992ee4-b554-4ee1-ac4e-537e92560ff4-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-685dd866df-djm4j\" (UID: \"69992ee4-b554-4ee1-ac4e-537e92560ff4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"
Apr 16 19:31:19.785835 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.785817 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d57dce11-4758-49ee-abae-5ed6192a28a3-klusterlet-config\") pod \"klusterlet-addon-workmgr-557b7d9676-db5sm\" (UID: \"d57dce11-4758-49ee-abae-5ed6192a28a3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.800780 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.800751 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qcv2\" (UniqueName: \"kubernetes.io/projected/d57dce11-4758-49ee-abae-5ed6192a28a3-kube-api-access-9qcv2\") pod \"klusterlet-addon-workmgr-557b7d9676-db5sm\" (UID: \"d57dce11-4758-49ee-abae-5ed6192a28a3\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.800946 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.800929 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgtq9\" (UniqueName: \"kubernetes.io/projected/69992ee4-b554-4ee1-ac4e-537e92560ff4-kube-api-access-bgtq9\") pod \"managed-serviceaccount-addon-agent-685dd866df-djm4j\" (UID: \"69992ee4-b554-4ee1-ac4e-537e92560ff4\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"
Apr 16 19:31:19.805524 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.805502 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8698848557-44r8j"]
Apr 16 19:31:19.808592 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.808578 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.819547 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.819528 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8698848557-44r8j"]
Apr 16 19:31:19.883450 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883364 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc472a4-e44f-4c82-a32b-3cda6b957d95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dpb5r\" (UID: \"8bc472a4-e44f-4c82-a32b-3cda6b957d95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.883450 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883400 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjcv4\" (UniqueName: \"kubernetes.io/projected/8bc472a4-e44f-4c82-a32b-3cda6b957d95-kube-api-access-gjcv4\") pod \"cluster-monitoring-operator-75587bd455-dpb5r\" (UID: \"8bc472a4-e44f-4c82-a32b-3cda6b957d95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.883450 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8bc472a4-e44f-4c82-a32b-3cda6b957d95-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-dpb5r\" (UID: \"8bc472a4-e44f-4c82-a32b-3cda6b957d95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.883717 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8nqk\" (UniqueName: \"kubernetes.io/projected/37b94f27-fe0b-4bd4-9976-8459a9f483b5-kube-api-access-h8nqk\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.883717 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883500 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/37b94f27-fe0b-4bd4-9976-8459a9f483b5-registry-tls\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.883717 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883524 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37b94f27-fe0b-4bd4-9976-8459a9f483b5-trusted-ca\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.883717 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883551 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/37b94f27-fe0b-4bd4-9976-8459a9f483b5-image-registry-private-configuration\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.883717 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883580 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/37b94f27-fe0b-4bd4-9976-8459a9f483b5-registry-certificates\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.883717 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883665 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/37b94f27-fe0b-4bd4-9976-8459a9f483b5-installation-pull-secrets\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.883926 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883736 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/37b94f27-fe0b-4bd4-9976-8459a9f483b5-ca-trust-extracted\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.883926 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.883795 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37b94f27-fe0b-4bd4-9976-8459a9f483b5-bound-sa-token\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.884182 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.884166 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/8bc472a4-e44f-4c82-a32b-3cda6b957d95-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-dpb5r\" (UID: \"8bc472a4-e44f-4c82-a32b-3cda6b957d95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.886015 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.885991 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bc472a4-e44f-4c82-a32b-3cda6b957d95-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-dpb5r\" (UID: \"8bc472a4-e44f-4c82-a32b-3cda6b957d95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.886473 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.886456 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x"
Apr 16 19:31:19.893527 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.893510 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x"
Apr 16 19:31:19.894349 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.894332 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjcv4\" (UniqueName: \"kubernetes.io/projected/8bc472a4-e44f-4c82-a32b-3cda6b957d95-kube-api-access-gjcv4\") pod \"cluster-monitoring-operator-75587bd455-dpb5r\" (UID: \"8bc472a4-e44f-4c82-a32b-3cda6b957d95\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"
Apr 16 19:31:19.935754 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.935729 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"
Apr 16 19:31:19.941544 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.941512 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"
Apr 16 19:31:19.985094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.984640 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbt4b\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-kube-api-access-pbt4b\") pod \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") "
Apr 16 19:31:19.985094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.984715 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-trusted-ca\") pod \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") "
Apr 16 19:31:19.985094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.984770 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-bound-sa-token\") pod \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") "
Apr 16 19:31:19.985094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.984809 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-ca-trust-extracted\") pod \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") "
Apr 16 19:31:19.985094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.984837 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-certificates\") pod \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") "
Apr 16 19:31:19.985094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.984886 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-installation-pull-secrets\") pod \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") "
Apr 16 19:31:19.985094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.984921 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-image-registry-private-configuration\") pod \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\" (UID: \"b98edfdf-3c43-46b2-a5e7-d5733e434e5b\") "
Apr 16 19:31:19.985094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.985032 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8nqk\" (UniqueName: \"kubernetes.io/projected/37b94f27-fe0b-4bd4-9976-8459a9f483b5-kube-api-access-h8nqk\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.985094 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.985082 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/37b94f27-fe0b-4bd4-9976-8459a9f483b5-registry-tls\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.985567 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.985107 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37b94f27-fe0b-4bd4-9976-8459a9f483b5-trusted-ca\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.985567 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.985138 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/37b94f27-fe0b-4bd4-9976-8459a9f483b5-image-registry-private-configuration\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.985567 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.985162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/37b94f27-fe0b-4bd4-9976-8459a9f483b5-registry-certificates\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.985567 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.985201 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/37b94f27-fe0b-4bd4-9976-8459a9f483b5-installation-pull-secrets\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.985567 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.985230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/37b94f27-fe0b-4bd4-9976-8459a9f483b5-ca-trust-extracted\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.985567 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.985291 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37b94f27-fe0b-4bd4-9976-8459a9f483b5-bound-sa-token\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.986994 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.986344 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b98edfdf-3c43-46b2-a5e7-d5733e434e5b" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:31:19.986994 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.986737 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b98edfdf-3c43-46b2-a5e7-d5733e434e5b" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:31:19.988553 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.988523 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/37b94f27-fe0b-4bd4-9976-8459a9f483b5-registry-certificates\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.989663 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.989636 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/37b94f27-fe0b-4bd4-9976-8459a9f483b5-trusted-ca\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.990087 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.990050 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b98edfdf-3c43-46b2-a5e7-d5733e434e5b" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:31:19.990539 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.990387 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/37b94f27-fe0b-4bd4-9976-8459a9f483b5-ca-trust-extracted\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.993180 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.992946 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-kube-api-access-pbt4b" (OuterVolumeSpecName: "kube-api-access-pbt4b") pod "b98edfdf-3c43-46b2-a5e7-d5733e434e5b" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b"). InnerVolumeSpecName "kube-api-access-pbt4b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:31:19.993180 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.993059 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/37b94f27-fe0b-4bd4-9976-8459a9f483b5-image-registry-private-configuration\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.993352 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.993286 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b98edfdf-3c43-46b2-a5e7-d5733e434e5b" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:31:19.993588 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.993559 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/37b94f27-fe0b-4bd4-9976-8459a9f483b5-registry-tls\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.993975 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.993950 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b98edfdf-3c43-46b2-a5e7-d5733e434e5b" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:31:19.994754 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.994711 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b98edfdf-3c43-46b2-a5e7-d5733e434e5b" (UID: "b98edfdf-3c43-46b2-a5e7-d5733e434e5b"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:31:19.996050 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.996010 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/37b94f27-fe0b-4bd4-9976-8459a9f483b5-bound-sa-token\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.996138 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.996106 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/37b94f27-fe0b-4bd4-9976-8459a9f483b5-installation-pull-secrets\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:19.998223 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:19.998192 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8nqk\" (UniqueName: \"kubernetes.io/projected/37b94f27-fe0b-4bd4-9976-8459a9f483b5-kube-api-access-h8nqk\") pod \"image-registry-8698848557-44r8j\" (UID: \"37b94f27-fe0b-4bd4-9976-8459a9f483b5\") " pod="openshift-image-registry/image-registry-8698848557-44r8j"
Apr 16 19:31:20.035181 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.035155 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r" Apr 16 19:31:20.060066 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.060034 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j"] Apr 16 19:31:20.064324 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:20.064296 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69992ee4_b554_4ee1_ac4e_537e92560ff4.slice/crio-4259f46188c24d75d0458a0d1f3af754cbc22def1a9857be3601109c4390fea3 WatchSource:0}: Error finding container 4259f46188c24d75d0458a0d1f3af754cbc22def1a9857be3601109c4390fea3: Status 404 returned error can't find the container with id 4259f46188c24d75d0458a0d1f3af754cbc22def1a9857be3601109c4390fea3 Apr 16 19:31:20.080549 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.080526 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm"] Apr 16 19:31:20.083431 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:20.083380 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd57dce11_4758_49ee_abae_5ed6192a28a3.slice/crio-c756c94044a4845f2e049036c826ea08a9a60bce06a9f555102faf8825cdc6aa WatchSource:0}: Error finding container c756c94044a4845f2e049036c826ea08a9a60bce06a9f555102faf8825cdc6aa: Status 404 returned error can't find the container with id c756c94044a4845f2e049036c826ea08a9a60bce06a9f555102faf8825cdc6aa Apr 16 19:31:20.086010 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.085955 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-bound-sa-token\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 
19:31:20.086010 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.085979 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-ca-trust-extracted\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:31:20.086010 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.085994 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-certificates\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:31:20.086156 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.086012 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-installation-pull-secrets\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:31:20.086156 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.086027 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-image-registry-private-configuration\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:31:20.086156 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.086042 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pbt4b\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-kube-api-access-pbt4b\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:31:20.086156 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.086056 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-trusted-ca\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:31:20.125532 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:31:20.125511 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-25xm9\"" Apr 16 19:31:20.127635 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.127607 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8698848557-44r8j" Apr 16 19:31:20.153905 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.153880 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r"] Apr 16 19:31:20.157019 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:20.156985 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bc472a4_e44f_4c82_a32b_3cda6b957d95.slice/crio-3d6646d18cd5f98ba4d0eb9ec2b8d61680a4ae742d40e1384538ccf13e89d5a5 WatchSource:0}: Error finding container 3d6646d18cd5f98ba4d0eb9ec2b8d61680a4ae742d40e1384538ccf13e89d5a5: Status 404 returned error can't find the container with id 3d6646d18cd5f98ba4d0eb9ec2b8d61680a4ae742d40e1384538ccf13e89d5a5 Apr 16 19:31:20.242636 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.242605 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8698848557-44r8j"] Apr 16 19:31:20.245714 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:20.245664 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37b94f27_fe0b_4bd4_9976_8459a9f483b5.slice/crio-51e545c0ef95664975f973ba8da412c3c2b271490cd059557f2e0edfaf473e33 WatchSource:0}: Error finding container 51e545c0ef95664975f973ba8da412c3c2b271490cd059557f2e0edfaf473e33: Status 404 returned error can't find the container with id 51e545c0ef95664975f973ba8da412c3c2b271490cd059557f2e0edfaf473e33 Apr 16 19:31:20.890979 ip-10-0-133-198 kubenswrapper[2568]: I0416 
19:31:20.890917 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm" event={"ID":"d57dce11-4758-49ee-abae-5ed6192a28a3","Type":"ContainerStarted","Data":"c756c94044a4845f2e049036c826ea08a9a60bce06a9f555102faf8825cdc6aa"} Apr 16 19:31:20.892379 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.892346 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j" event={"ID":"69992ee4-b554-4ee1-ac4e-537e92560ff4","Type":"ContainerStarted","Data":"4259f46188c24d75d0458a0d1f3af754cbc22def1a9857be3601109c4390fea3"} Apr 16 19:31:20.894568 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.894537 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8698848557-44r8j" event={"ID":"37b94f27-fe0b-4bd4-9976-8459a9f483b5","Type":"ContainerStarted","Data":"0906a94398c24b664f7d7054b63f2e5634dc81ed652958a46bd5b91c0b6846ab"} Apr 16 19:31:20.894706 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.894576 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8698848557-44r8j" event={"ID":"37b94f27-fe0b-4bd4-9976-8459a9f483b5","Type":"ContainerStarted","Data":"51e545c0ef95664975f973ba8da412c3c2b271490cd059557f2e0edfaf473e33"} Apr 16 19:31:20.895315 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.895278 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8698848557-44r8j" Apr 16 19:31:20.896879 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.896840 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r" event={"ID":"8bc472a4-e44f-4c82-a32b-3cda6b957d95","Type":"ContainerStarted","Data":"3d6646d18cd5f98ba4d0eb9ec2b8d61680a4ae742d40e1384538ccf13e89d5a5"} Apr 16 
19:31:20.896879 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.896859 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-845cbc8db8-wbt2x" Apr 16 19:31:20.918040 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.917988 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8698848557-44r8j" podStartSLOduration=1.917971352 podStartE2EDuration="1.917971352s" podCreationTimestamp="2026-04-16 19:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:31:20.917105549 +0000 UTC m=+63.840802150" watchObservedRunningTime="2026-04-16 19:31:20.917971352 +0000 UTC m=+63.841667954" Apr 16 19:31:20.955092 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.955060 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-845cbc8db8-wbt2x"] Apr 16 19:31:20.956802 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.956780 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-845cbc8db8-wbt2x"] Apr 16 19:31:20.993702 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:20.992821 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b98edfdf-3c43-46b2-a5e7-d5733e434e5b-registry-tls\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:31:21.654301 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:21.654260 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b98edfdf-3c43-46b2-a5e7-d5733e434e5b" path="/var/lib/kubelet/pods/b98edfdf-3c43-46b2-a5e7-d5733e434e5b/volumes" Apr 16 19:31:21.799890 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:21.799567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:31:21.813365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:21.813299 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d702f7c4-fdbf-4294-9aee-07546029945f-metrics-tls\") pod \"dns-default-brrh2\" (UID: \"d702f7c4-fdbf-4294-9aee-07546029945f\") " pod="openshift-dns/dns-default-brrh2" Apr 16 19:31:21.900454 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:21.900010 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:31:21.906196 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:21.906126 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09f9c8cc-cfd9-462c-9967-1afa5e6543ea-cert\") pod \"ingress-canary-gqlxj\" (UID: \"09f9c8cc-cfd9-462c-9967-1afa5e6543ea\") " pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:31:22.098002 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:22.097969 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fkm8b\"" Apr 16 19:31:22.106082 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:22.106055 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-brrh2" Apr 16 19:31:22.110171 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:22.110150 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fqt5t\"" Apr 16 19:31:22.118956 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:22.118933 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gqlxj" Apr 16 19:31:23.312187 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.312148 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:31:23.312624 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.312225 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:31:23.314780 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.314756 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:31:23.314884 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.314756 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:31:23.325048 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.324903 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 
19:31:23.325424 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.325393 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1a9f09ff-a8fe-41b7-b833-8c5091a88fb6-metrics-certs\") pod \"network-metrics-daemon-p9mcp\" (UID: \"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6\") " pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:31:23.335466 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.335435 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f58z2\" (UniqueName: \"kubernetes.io/projected/9406cd72-aaad-46f9-8e4d-59aa641ee42f-kube-api-access-f58z2\") pod \"network-check-target-h6v8t\" (UID: \"9406cd72-aaad-46f9-8e4d-59aa641ee42f\") " pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:31:23.466268 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.466236 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-52hfs\"" Apr 16 19:31:23.470233 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.470201 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p422q\"" Apr 16 19:31:23.473923 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.473902 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:31:23.478599 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:23.478576 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p9mcp" Apr 16 19:31:25.097888 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.097807 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p9mcp"] Apr 16 19:31:25.112480 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.111374 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-brrh2"] Apr 16 19:31:25.135120 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.135069 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gqlxj"] Apr 16 19:31:25.138894 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:25.138843 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f9c8cc_cfd9_462c_9967_1afa5e6543ea.slice/crio-7c328e7da6c21fc4a52e0b2f8f66551b952e3844be9f5e8c4a009a899ef0327d WatchSource:0}: Error finding container 7c328e7da6c21fc4a52e0b2f8f66551b952e3844be9f5e8c4a009a899ef0327d: Status 404 returned error can't find the container with id 7c328e7da6c21fc4a52e0b2f8f66551b952e3844be9f5e8c4a009a899ef0327d Apr 16 19:31:25.149884 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.149851 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h6v8t"] Apr 16 19:31:25.164666 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:25.164583 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9406cd72_aaad_46f9_8e4d_59aa641ee42f.slice/crio-ab0f9c69e4dc16d621788fce8899160c18ca35b44fbde7896e831b7e451bad0f WatchSource:0}: Error finding container ab0f9c69e4dc16d621788fce8899160c18ca35b44fbde7896e831b7e451bad0f: Status 404 returned error can't find the container with id ab0f9c69e4dc16d621788fce8899160c18ca35b44fbde7896e831b7e451bad0f Apr 16 19:31:25.529310 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:31:25.529212 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6"] Apr 16 19:31:25.532313 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.532288 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" Apr 16 19:31:25.533870 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.533846 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6"] Apr 16 19:31:25.534915 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.534895 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 19:31:25.535207 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.535181 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-96g8w\"" Apr 16 19:31:25.632577 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.632537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/403c7ad0-c2e7-4007-a079-55ac5fac2efe-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6wwx6\" (UID: \"403c7ad0-c2e7-4007-a079-55ac5fac2efe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" Apr 16 19:31:25.734433 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.733875 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/403c7ad0-c2e7-4007-a079-55ac5fac2efe-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6wwx6\" (UID: \"403c7ad0-c2e7-4007-a079-55ac5fac2efe\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" Apr 16 19:31:25.737461 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.737401 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/403c7ad0-c2e7-4007-a079-55ac5fac2efe-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-6wwx6\" (UID: \"403c7ad0-c2e7-4007-a079-55ac5fac2efe\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" Apr 16 19:31:25.844734 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.844305 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" Apr 16 19:31:25.918254 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.916154 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm" event={"ID":"d57dce11-4758-49ee-abae-5ed6192a28a3","Type":"ContainerStarted","Data":"c55c40f0151b0436fed28367afbec58fae750d71b2c6fc2f2fe8c4f0ba16cf83"} Apr 16 19:31:25.918254 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.917737 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm" Apr 16 19:31:25.920234 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.919443 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm" Apr 16 19:31:25.921507 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.921350 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j" 
event={"ID":"69992ee4-b554-4ee1-ac4e-537e92560ff4","Type":"ContainerStarted","Data":"0edd1813776fb5fe1101a6a6e661fa3b7ba47e9d01ca6bdfcfc4ca29be6bd7ce"} Apr 16 19:31:25.924817 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.924780 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p9mcp" event={"ID":"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6","Type":"ContainerStarted","Data":"9415ac6d15889af780ff4b8bf2cb0f856c8854a376887ad3dfeb8f295fe83051"} Apr 16 19:31:25.926577 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.926536 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-brrh2" event={"ID":"d702f7c4-fdbf-4294-9aee-07546029945f","Type":"ContainerStarted","Data":"14c770a29942374a48519f5ad4f11f7ebae0e1190b875cc16761091a353a167e"} Apr 16 19:31:25.930590 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.928452 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h6v8t" event={"ID":"9406cd72-aaad-46f9-8e4d-59aa641ee42f","Type":"ContainerStarted","Data":"ab0f9c69e4dc16d621788fce8899160c18ca35b44fbde7896e831b7e451bad0f"} Apr 16 19:31:25.937725 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.934307 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r" event={"ID":"8bc472a4-e44f-4c82-a32b-3cda6b957d95","Type":"ContainerStarted","Data":"46aadf291672fe499330f64d6c8982e054a62d0341ef65d96b42ea8ff393d2d3"} Apr 16 19:31:25.941123 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.941099 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gqlxj" event={"ID":"09f9c8cc-cfd9-462c-9967-1afa5e6543ea","Type":"ContainerStarted","Data":"7c328e7da6c21fc4a52e0b2f8f66551b952e3844be9f5e8c4a009a899ef0327d"} Apr 16 19:31:25.948966 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.947534 2568 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-557b7d9676-db5sm" podStartSLOduration=2.101343649 podStartE2EDuration="6.947517557s" podCreationTimestamp="2026-04-16 19:31:19 +0000 UTC" firstStartedPulling="2026-04-16 19:31:20.086715909 +0000 UTC m=+63.010412489" lastFinishedPulling="2026-04-16 19:31:24.932889805 +0000 UTC m=+67.856586397" observedRunningTime="2026-04-16 19:31:25.933121952 +0000 UTC m=+68.856818554" watchObservedRunningTime="2026-04-16 19:31:25.947517557 +0000 UTC m=+68.871214158" Apr 16 19:31:25.970441 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.970386 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-685dd866df-djm4j" podStartSLOduration=2.120994725 podStartE2EDuration="6.970368593s" podCreationTimestamp="2026-04-16 19:31:19 +0000 UTC" firstStartedPulling="2026-04-16 19:31:20.066082405 +0000 UTC m=+62.989778986" lastFinishedPulling="2026-04-16 19:31:24.915456264 +0000 UTC m=+67.839152854" observedRunningTime="2026-04-16 19:31:25.948733584 +0000 UTC m=+68.872430185" watchObservedRunningTime="2026-04-16 19:31:25.970368593 +0000 UTC m=+68.894065194" Apr 16 19:31:25.985058 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:25.985015 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-dpb5r" podStartSLOduration=2.232296838 podStartE2EDuration="6.985001487s" podCreationTimestamp="2026-04-16 19:31:19 +0000 UTC" firstStartedPulling="2026-04-16 19:31:20.158833209 +0000 UTC m=+63.082529790" lastFinishedPulling="2026-04-16 19:31:24.911537861 +0000 UTC m=+67.835234439" observedRunningTime="2026-04-16 19:31:25.984313634 +0000 UTC m=+68.908010235" watchObservedRunningTime="2026-04-16 19:31:25.985001487 +0000 UTC m=+68.908698089" Apr 16 19:31:26.006968 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:26.006934 2568 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6"] Apr 16 19:31:26.016196 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:26.016145 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403c7ad0_c2e7_4007_a079_55ac5fac2efe.slice/crio-66eeec6fba023667a0a61c9f6562ee6ee2711132eebf4cb87d12370de52a10cc WatchSource:0}: Error finding container 66eeec6fba023667a0a61c9f6562ee6ee2711132eebf4cb87d12370de52a10cc: Status 404 returned error can't find the container with id 66eeec6fba023667a0a61c9f6562ee6ee2711132eebf4cb87d12370de52a10cc Apr 16 19:31:26.706731 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:26.706697 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ftbrq_fd63a5dc-d580-47b6-b37d-b1972fcea60a/dns-node-resolver/0.log" Apr 16 19:31:26.945417 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:26.945383 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" event={"ID":"403c7ad0-c2e7-4007-a079-55ac5fac2efe","Type":"ContainerStarted","Data":"66eeec6fba023667a0a61c9f6562ee6ee2711132eebf4cb87d12370de52a10cc"} Apr 16 19:31:27.307162 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.307127 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8698848557-44r8j_37b94f27-fe0b-4bd4-9976-8459a9f483b5/registry/0.log" Apr 16 19:31:27.906391 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.906304 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vsb4w_ab25b071-b28e-4ed4-8595-4ea92620f2bd/node-ca/0.log" Apr 16 19:31:27.985770 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.985734 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-nb8l9"] Apr 16 
19:31:27.990762 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.990741 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:27.993117 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.993079 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:31:27.993254 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.993091 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:31:27.993254 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.993186 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:31:27.993254 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.993201 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:31:27.994042 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.994021 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jq9hf\"" Apr 16 19:31:27.998360 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:27.998334 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nb8l9"] Apr 16 19:31:28.053261 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.053225 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1c015403-db65-48c7-860c-61aaa90431fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.053475 
ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.053282 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1c015403-db65-48c7-860c-61aaa90431fd-data-volume\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.053475 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.053401 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1c015403-db65-48c7-860c-61aaa90431fd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.053475 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.053459 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1c015403-db65-48c7-860c-61aaa90431fd-crio-socket\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.053599 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.053499 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g22h\" (UniqueName: \"kubernetes.io/projected/1c015403-db65-48c7-860c-61aaa90431fd-kube-api-access-2g22h\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.154194 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.154153 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/1c015403-db65-48c7-860c-61aaa90431fd-crio-socket\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.154194 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.154207 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2g22h\" (UniqueName: \"kubernetes.io/projected/1c015403-db65-48c7-860c-61aaa90431fd-kube-api-access-2g22h\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.154440 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.154233 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1c015403-db65-48c7-860c-61aaa90431fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.154440 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.154269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1c015403-db65-48c7-860c-61aaa90431fd-data-volume\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.154440 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.154309 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1c015403-db65-48c7-860c-61aaa90431fd-crio-socket\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.154440 ip-10-0-133-198 
kubenswrapper[2568]: E0416 19:31:28.154370 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:31:28.154440 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:28.154437 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c015403-db65-48c7-860c-61aaa90431fd-insights-runtime-extractor-tls podName:1c015403-db65-48c7-860c-61aaa90431fd nodeName:}" failed. No retries permitted until 2026-04-16 19:31:28.654415029 +0000 UTC m=+71.578111612 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/1c015403-db65-48c7-860c-61aaa90431fd-insights-runtime-extractor-tls") pod "insights-runtime-extractor-nb8l9" (UID: "1c015403-db65-48c7-860c-61aaa90431fd") : secret "insights-runtime-extractor-tls" not found Apr 16 19:31:28.154707 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.154318 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1c015403-db65-48c7-860c-61aaa90431fd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.154707 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.154609 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1c015403-db65-48c7-860c-61aaa90431fd-data-volume\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.154934 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.154906 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/1c015403-db65-48c7-860c-61aaa90431fd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.163254 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.163199 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g22h\" (UniqueName: \"kubernetes.io/projected/1c015403-db65-48c7-860c-61aaa90431fd-kube-api-access-2g22h\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.659027 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.658992 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1c015403-db65-48c7-860c-61aaa90431fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.661918 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.661886 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1c015403-db65-48c7-860c-61aaa90431fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-nb8l9\" (UID: \"1c015403-db65-48c7-860c-61aaa90431fd\") " pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.903286 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.903204 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-nb8l9" Apr 16 19:31:28.951642 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:28.951593 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p9mcp" event={"ID":"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6","Type":"ContainerStarted","Data":"afc1f0cf8b022feea74376b75702a43ad358b3d3f5627a44e3d45021a9afe115"} Apr 16 19:31:29.094582 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.094406 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-nb8l9"] Apr 16 19:31:29.097599 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:29.097571 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c015403_db65_48c7_860c_61aaa90431fd.slice/crio-60a4b8044895a97d25b50b3ab8d3713e25efbaa82d6a8b807da58833d7dc81cf WatchSource:0}: Error finding container 60a4b8044895a97d25b50b3ab8d3713e25efbaa82d6a8b807da58833d7dc81cf: Status 404 returned error can't find the container with id 60a4b8044895a97d25b50b3ab8d3713e25efbaa82d6a8b807da58833d7dc81cf Apr 16 19:31:29.957528 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.957494 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p9mcp" event={"ID":"1a9f09ff-a8fe-41b7-b833-8c5091a88fb6","Type":"ContainerStarted","Data":"b18e4a1d262d53afa7f52b41ea2942166f196ab13ae94634c7e0997d0ac84558"} Apr 16 19:31:29.959042 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.959018 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-brrh2" event={"ID":"d702f7c4-fdbf-4294-9aee-07546029945f","Type":"ContainerStarted","Data":"68a51021d4be340b74628eaf9359e31faf9aa4caf5c331ac34e72442782dbddf"} Apr 16 19:31:29.959161 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.959050 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-brrh2" event={"ID":"d702f7c4-fdbf-4294-9aee-07546029945f","Type":"ContainerStarted","Data":"1bffe1bfed1ad208bfd8746355290cda40e7c69bed7ad7fa8c46309ca4e84506"} Apr 16 19:31:29.959211 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.959188 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-brrh2" Apr 16 19:31:29.963413 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.963388 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h6v8t" event={"ID":"9406cd72-aaad-46f9-8e4d-59aa641ee42f","Type":"ContainerStarted","Data":"25d6d7313a4d89f7d710acfcb0b30329514114904bae397dbf62a4f22836a0dd"} Apr 16 19:31:29.963545 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.963518 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:31:29.964649 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.964617 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" event={"ID":"403c7ad0-c2e7-4007-a079-55ac5fac2efe","Type":"ContainerStarted","Data":"5ebf4656c65863343fa2ee8380d77bda884c3605bbc5033709e97aa3fab7a408"} Apr 16 19:31:29.964781 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.964763 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" Apr 16 19:31:29.965921 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.965900 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nb8l9" event={"ID":"1c015403-db65-48c7-860c-61aaa90431fd","Type":"ContainerStarted","Data":"84582c10a4ce1d847950ec8b466ae31bb1061e3b069257a21d3cb181bac1719e"} Apr 16 19:31:29.966000 ip-10-0-133-198 kubenswrapper[2568]: I0416 
19:31:29.965924 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nb8l9" event={"ID":"1c015403-db65-48c7-860c-61aaa90431fd","Type":"ContainerStarted","Data":"60a4b8044895a97d25b50b3ab8d3713e25efbaa82d6a8b807da58833d7dc81cf"} Apr 16 19:31:29.967148 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.967128 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gqlxj" event={"ID":"09f9c8cc-cfd9-462c-9967-1afa5e6543ea","Type":"ContainerStarted","Data":"502dcb56e2f3f7fad4230435d5299fc154809ee70710d59ad2ddced274fd1e40"} Apr 16 19:31:29.969641 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.969625 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" Apr 16 19:31:29.976039 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.975998 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-p9mcp" podStartSLOduration=69.474471231 podStartE2EDuration="1m12.975986524s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:31:25.107847044 +0000 UTC m=+68.031543622" lastFinishedPulling="2026-04-16 19:31:28.609362323 +0000 UTC m=+71.533058915" observedRunningTime="2026-04-16 19:31:29.975271598 +0000 UTC m=+72.898968198" watchObservedRunningTime="2026-04-16 19:31:29.975986524 +0000 UTC m=+72.899683123" Apr 16 19:31:29.990897 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:29.990850 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-6wwx6" podStartSLOduration=2.003672375 podStartE2EDuration="4.990833419s" podCreationTimestamp="2026-04-16 19:31:25 +0000 UTC" firstStartedPulling="2026-04-16 19:31:26.017772109 +0000 UTC m=+68.941468694" lastFinishedPulling="2026-04-16 19:31:29.004933139 +0000 
UTC m=+71.928629738" observedRunningTime="2026-04-16 19:31:29.990000598 +0000 UTC m=+72.913697197" watchObservedRunningTime="2026-04-16 19:31:29.990833419 +0000 UTC m=+72.914530019" Apr 16 19:31:30.007073 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:30.007007 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-h6v8t" podStartSLOduration=69.258465322 podStartE2EDuration="1m13.00698879s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:31:25.167244948 +0000 UTC m=+68.090941525" lastFinishedPulling="2026-04-16 19:31:28.9157684 +0000 UTC m=+71.839464993" observedRunningTime="2026-04-16 19:31:30.00576409 +0000 UTC m=+72.929460693" watchObservedRunningTime="2026-04-16 19:31:30.00698879 +0000 UTC m=+72.930685393" Apr 16 19:31:30.021718 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:30.021652 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gqlxj" podStartSLOduration=37.551929724 podStartE2EDuration="41.021634512s" podCreationTimestamp="2026-04-16 19:30:49 +0000 UTC" firstStartedPulling="2026-04-16 19:31:25.142375359 +0000 UTC m=+68.066071938" lastFinishedPulling="2026-04-16 19:31:28.612080133 +0000 UTC m=+71.535776726" observedRunningTime="2026-04-16 19:31:30.021245354 +0000 UTC m=+72.944941955" watchObservedRunningTime="2026-04-16 19:31:30.021634512 +0000 UTC m=+72.945331114" Apr 16 19:31:30.972047 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:30.971993 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-nb8l9" event={"ID":"1c015403-db65-48c7-860c-61aaa90431fd","Type":"ContainerStarted","Data":"a1090f2dd79aab4a080a08ab427ced1c46215637b6e470c00f84877342f7f5f0"} Apr 16 19:31:32.979847 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:32.979807 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-nb8l9" event={"ID":"1c015403-db65-48c7-860c-61aaa90431fd","Type":"ContainerStarted","Data":"c952c7d2f7e5f71b4b5ab2bbc828995cc009b3bba671c0ac51c738e1bbb02d29"} Apr 16 19:31:32.998474 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:32.998412 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-brrh2" podStartSLOduration=40.50343053 podStartE2EDuration="43.998392366s" podCreationTimestamp="2026-04-16 19:30:49 +0000 UTC" firstStartedPulling="2026-04-16 19:31:25.119345775 +0000 UTC m=+68.043042355" lastFinishedPulling="2026-04-16 19:31:28.614307599 +0000 UTC m=+71.538004191" observedRunningTime="2026-04-16 19:31:30.044872216 +0000 UTC m=+72.968568827" watchObservedRunningTime="2026-04-16 19:31:32.998392366 +0000 UTC m=+75.922088967" Apr 16 19:31:32.999384 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:32.999349 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-nb8l9" podStartSLOduration=3.191694407 podStartE2EDuration="5.999331575s" podCreationTimestamp="2026-04-16 19:31:27 +0000 UTC" firstStartedPulling="2026-04-16 19:31:29.200953986 +0000 UTC m=+72.124650579" lastFinishedPulling="2026-04-16 19:31:32.008591169 +0000 UTC m=+74.932287747" observedRunningTime="2026-04-16 19:31:32.997183516 +0000 UTC m=+75.920880116" watchObservedRunningTime="2026-04-16 19:31:32.999331575 +0000 UTC m=+75.923028174" Apr 16 19:31:34.983351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.983293 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7xz85"] Apr 16 19:31:34.986880 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.986855 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:34.989134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.989106 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 19:31:34.989280 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.989165 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 19:31:34.990343 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.990323 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-nfrcb\"" Apr 16 19:31:34.990343 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.990332 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:31:34.990505 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.990331 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 19:31:34.993241 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.993222 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9b9rf"] Apr 16 19:31:34.996418 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.996397 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:34.997277 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.997255 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7xz85"] Apr 16 19:31:34.998660 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.998642 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:31:34.998660 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.998656 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:31:34.998850 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.998720 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:31:34.998850 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:34.998782 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qtsms\"" Apr 16 19:31:35.112963 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.112932 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e3d17f2-5079-43eb-a07c-110bfa423d12-sys\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.113133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.112974 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0525fb0-30a0-4fd3-a006-2b9b67460566-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.113133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.112998 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-tls\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.113133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113023 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.113133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113041 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-accelerators-collector-config\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.113133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113123 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-wtmp\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.113331 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113159 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.113331 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113183 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3e3d17f2-5079-43eb-a07c-110bfa423d12-root\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.113331 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113199 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-textfile\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.113331 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113216 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6cn6\" (UniqueName: \"kubernetes.io/projected/3e3d17f2-5079-43eb-a07c-110bfa423d12-kube-api-access-g6cn6\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.113331 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113235 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkvjx\" (UniqueName: \"kubernetes.io/projected/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-api-access-mkvjx\") pod 
\"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.113331 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113253 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c0525fb0-30a0-4fd3-a006-2b9b67460566-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.113331 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113322 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.113544 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113353 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e3d17f2-5079-43eb-a07c-110bfa423d12-metrics-client-ca\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.113544 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.113369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.214490 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-wtmp\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.214711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214513 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.214711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3e3d17f2-5079-43eb-a07c-110bfa423d12-root\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.214711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214571 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-textfile\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.214711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214596 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6cn6\" (UniqueName: 
\"kubernetes.io/projected/3e3d17f2-5079-43eb-a07c-110bfa423d12-kube-api-access-g6cn6\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.214711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214621 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkvjx\" (UniqueName: \"kubernetes.io/projected/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-api-access-mkvjx\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.214711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214650 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c0525fb0-30a0-4fd3-a006-2b9b67460566-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.214711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214667 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3e3d17f2-5079-43eb-a07c-110bfa423d12-root\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf" Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214719 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: I0416 
19:31:35.214749 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e3d17f2-5079-43eb-a07c-110bfa423d12-metrics-client-ca\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214770 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214648 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-wtmp\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214800 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e3d17f2-5079-43eb-a07c-110bfa423d12-sys\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0525fb0-30a0-4fd3-a006-2b9b67460566-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214862 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-tls\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214888 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:35.214908 2568 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 16 19:31:35.215056 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:35.214994 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-tls podName:c0525fb0-30a0-4fd3-a006-2b9b67460566 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:35.714971022 +0000 UTC m=+78.638667614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-7xz85" (UID: "c0525fb0-30a0-4fd3-a006-2b9b67460566") : secret "kube-state-metrics-tls" not found
Apr 16 19:31:35.215529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.214913 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-accelerators-collector-config\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.215529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.215314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e3d17f2-5079-43eb-a07c-110bfa423d12-metrics-client-ca\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.215529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.215375 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3e3d17f2-5079-43eb-a07c-110bfa423d12-sys\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.215529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.215379 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/c0525fb0-30a0-4fd3-a006-2b9b67460566-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.215529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.215395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.215529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.215427 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-accelerators-collector-config\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.215861 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.215751 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-textfile\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.216065 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.216022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c0525fb0-30a0-4fd3-a006-2b9b67460566-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.218397 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.218376 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.218567 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.218544 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3e3d17f2-5079-43eb-a07c-110bfa423d12-node-exporter-tls\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.218630 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.218614 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.226519 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.226491 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6cn6\" (UniqueName: \"kubernetes.io/projected/3e3d17f2-5079-43eb-a07c-110bfa423d12-kube-api-access-g6cn6\") pod \"node-exporter-9b9rf\" (UID: \"3e3d17f2-5079-43eb-a07c-110bfa423d12\") " pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.226654 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.226629 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkvjx\" (UniqueName: \"kubernetes.io/projected/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-api-access-mkvjx\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.305902 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.305814 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9b9rf"
Apr 16 19:31:35.316867 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:35.316820 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e3d17f2_5079_43eb_a07c_110bfa423d12.slice/crio-a74e0eddc7e3d806da4c2ecb1f38bc36b10b8955c06bb8be7f0ba24999ae00c1 WatchSource:0}: Error finding container a74e0eddc7e3d806da4c2ecb1f38bc36b10b8955c06bb8be7f0ba24999ae00c1: Status 404 returned error can't find the container with id a74e0eddc7e3d806da4c2ecb1f38bc36b10b8955c06bb8be7f0ba24999ae00c1
Apr 16 19:31:35.719921 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.719882 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.722581 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.722554 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c0525fb0-30a0-4fd3-a006-2b9b67460566-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-7xz85\" (UID: \"c0525fb0-30a0-4fd3-a006-2b9b67460566\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.897698 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.897645 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85"
Apr 16 19:31:35.992537 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:35.992432 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9b9rf" event={"ID":"3e3d17f2-5079-43eb-a07c-110bfa423d12","Type":"ContainerStarted","Data":"a74e0eddc7e3d806da4c2ecb1f38bc36b10b8955c06bb8be7f0ba24999ae00c1"}
Apr 16 19:31:36.083578 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:36.083423 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-7xz85"]
Apr 16 19:31:36.088410 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:36.088379 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0525fb0_30a0_4fd3_a006_2b9b67460566.slice/crio-2f92f73c4d441b05eaa5bde93df8df19effda40a52141575e2e4bd0307d178b7 WatchSource:0}: Error finding container 2f92f73c4d441b05eaa5bde93df8df19effda40a52141575e2e4bd0307d178b7: Status 404 returned error can't find the container with id 2f92f73c4d441b05eaa5bde93df8df19effda40a52141575e2e4bd0307d178b7
Apr 16 19:31:36.998603 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:36.998564 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" event={"ID":"c0525fb0-30a0-4fd3-a006-2b9b67460566","Type":"ContainerStarted","Data":"2f92f73c4d441b05eaa5bde93df8df19effda40a52141575e2e4bd0307d178b7"}
Apr 16 19:31:37.001876 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:37.001238 2568 generic.go:358] "Generic (PLEG): container finished" podID="3e3d17f2-5079-43eb-a07c-110bfa423d12" containerID="27844333fb7b3a8cf3817f950ddd27e22e9fe41e5b590e4fc57cc2cbf4fa680d" exitCode=0
Apr 16 19:31:37.001876 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:37.001338 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9b9rf" event={"ID":"3e3d17f2-5079-43eb-a07c-110bfa423d12","Type":"ContainerDied","Data":"27844333fb7b3a8cf3817f950ddd27e22e9fe41e5b590e4fc57cc2cbf4fa680d"}
Apr 16 19:31:38.006375 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.006335 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" event={"ID":"c0525fb0-30a0-4fd3-a006-2b9b67460566","Type":"ContainerStarted","Data":"07661925955c29d90094475bd5a8b3e53e3653efa28b23b2475615cfcef83ce7"}
Apr 16 19:31:38.006847 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.006385 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" event={"ID":"c0525fb0-30a0-4fd3-a006-2b9b67460566","Type":"ContainerStarted","Data":"5d8cb535ab0bd78abbf7386a6cceb8a4dc56992bd0ae77d56a526fd77a7bad55"}
Apr 16 19:31:38.006847 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.006398 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" event={"ID":"c0525fb0-30a0-4fd3-a006-2b9b67460566","Type":"ContainerStarted","Data":"7371d425c4d08769d60d644c9d63430a7c2ba1f618ac69c8e1b399a3f0ec14f9"}
Apr 16 19:31:38.008789 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.008754 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9b9rf" event={"ID":"3e3d17f2-5079-43eb-a07c-110bfa423d12","Type":"ContainerStarted","Data":"b7cf96b123115c294e5a96e2c9132c990a586409615c8b9aeddeeacf76280672"}
Apr 16 19:31:38.008929 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.008796 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9b9rf" event={"ID":"3e3d17f2-5079-43eb-a07c-110bfa423d12","Type":"ContainerStarted","Data":"5035dfee381bb69756722b0fcc1b689ce3ca9f45eb11aeb8bcd3dddadce4fc9e"}
Apr 16 19:31:38.029950 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.029887 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-7xz85" podStartSLOduration=2.472943647 podStartE2EDuration="4.029867316s" podCreationTimestamp="2026-04-16 19:31:34 +0000 UTC" firstStartedPulling="2026-04-16 19:31:36.094344554 +0000 UTC m=+79.018041144" lastFinishedPulling="2026-04-16 19:31:37.651268232 +0000 UTC m=+80.574964813" observedRunningTime="2026-04-16 19:31:38.029474004 +0000 UTC m=+80.953170628" watchObservedRunningTime="2026-04-16 19:31:38.029867316 +0000 UTC m=+80.953563916"
Apr 16 19:31:38.049805 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.049744 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9b9rf" podStartSLOduration=2.94513985 podStartE2EDuration="4.049722066s" podCreationTimestamp="2026-04-16 19:31:34 +0000 UTC" firstStartedPulling="2026-04-16 19:31:35.31901396 +0000 UTC m=+78.242710538" lastFinishedPulling="2026-04-16 19:31:36.423596161 +0000 UTC m=+79.347292754" observedRunningTime="2026-04-16 19:31:38.049451178 +0000 UTC m=+80.973147803" watchObservedRunningTime="2026-04-16 19:31:38.049722066 +0000 UTC m=+80.973418668"
Apr 16 19:31:38.064415 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.064365 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-57b9b4c777-nggfv"]
Apr 16 19:31:38.068586 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.068548 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.071529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.071488 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 19:31:38.071709 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.071556 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 19:31:38.071709 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.071498 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 19:31:38.071836 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.071498 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-xxfj8\""
Apr 16 19:31:38.071889 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.071867 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-c6m98qvv05aq2\""
Apr 16 19:31:38.072132 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.071966 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 19:31:38.072132 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.072003 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 19:31:38.082929 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.082897 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-57b9b4c777-nggfv"]
Apr 16 19:31:38.141869 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.141820 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6fg\" (UniqueName: \"kubernetes.io/projected/75ffdd93-6f95-4760-97cb-bc2ff147f109-kube-api-access-qt6fg\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.142014 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.141941 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.142014 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.141983 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.142374 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.142028 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75ffdd93-6f95-4760-97cb-bc2ff147f109-metrics-client-ca\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.142374 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.142073 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-grpc-tls\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.142374 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.142113 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.142374 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.142167 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-tls\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.142374 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.142193 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.243258 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.243163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt6fg\" (UniqueName: \"kubernetes.io/projected/75ffdd93-6f95-4760-97cb-bc2ff147f109-kube-api-access-qt6fg\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.243258 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.243218 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.243462 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.243333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.243462 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.243390 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75ffdd93-6f95-4760-97cb-bc2ff147f109-metrics-client-ca\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.243462 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.243417 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-grpc-tls\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.243462 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.243449 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.243652 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.243609 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-tls\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.243753 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.243652 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.244266 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.244209 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/75ffdd93-6f95-4760-97cb-bc2ff147f109-metrics-client-ca\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.246230 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.246201 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.246434 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.246414 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.246484 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.246436 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-grpc-tls\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.246563 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.246516 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.246633 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.246618 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-tls\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.246819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.246801 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/75ffdd93-6f95-4760-97cb-bc2ff147f109-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.250213 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.250187 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt6fg\" (UniqueName: \"kubernetes.io/projected/75ffdd93-6f95-4760-97cb-bc2ff147f109-kube-api-access-qt6fg\") pod \"thanos-querier-57b9b4c777-nggfv\" (UID: \"75ffdd93-6f95-4760-97cb-bc2ff147f109\") " pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.380582 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.380540 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv"
Apr 16 19:31:38.513015 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:38.510378 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-57b9b4c777-nggfv"]
Apr 16 19:31:39.012334 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:39.012293 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" event={"ID":"75ffdd93-6f95-4760-97cb-bc2ff147f109","Type":"ContainerStarted","Data":"a1eaab2915663867679a8b3b7e1e2dcc79069265ffade8e72d46f85f390c30e2"}
Apr 16 19:31:39.747868 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:39.747835 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-422kv"]
Apr 16 19:31:39.751037 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:39.751015 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv"
Apr 16 19:31:39.753785 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:39.753762 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-m2vl2\""
Apr 16 19:31:39.753785 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:39.753779 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 19:31:39.760358 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:39.760328 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-422kv"]
Apr 16 19:31:39.856787 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:39.856751 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/30844433-d473-4019-98ea-1e208e6aea91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-422kv\" (UID: \"30844433-d473-4019-98ea-1e208e6aea91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv"
Apr 16 19:31:39.958132 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:39.958095 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/30844433-d473-4019-98ea-1e208e6aea91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-422kv\" (UID: \"30844433-d473-4019-98ea-1e208e6aea91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv"
Apr 16 19:31:39.958324 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:39.958265 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 19:31:39.958388 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:39.958351 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30844433-d473-4019-98ea-1e208e6aea91-monitoring-plugin-cert podName:30844433-d473-4019-98ea-1e208e6aea91 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:40.458326673 +0000 UTC m=+83.382023271 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/30844433-d473-4019-98ea-1e208e6aea91-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-422kv" (UID: "30844433-d473-4019-98ea-1e208e6aea91") : secret "monitoring-plugin-cert" not found
Apr 16 19:31:39.975508 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:39.975478 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-brrh2"
Apr 16 19:31:40.463061 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:40.463034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/30844433-d473-4019-98ea-1e208e6aea91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-422kv\" (UID: \"30844433-d473-4019-98ea-1e208e6aea91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv"
Apr 16 19:31:40.463365 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:40.463162 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 16 19:31:40.463365 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:31:40.463218 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30844433-d473-4019-98ea-1e208e6aea91-monitoring-plugin-cert podName:30844433-d473-4019-98ea-1e208e6aea91 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:41.463203305 +0000 UTC m=+84.386899884 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/30844433-d473-4019-98ea-1e208e6aea91-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-422kv" (UID: "30844433-d473-4019-98ea-1e208e6aea91") : secret "monitoring-plugin-cert" not found
Apr 16 19:31:41.020524 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.020474 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" event={"ID":"75ffdd93-6f95-4760-97cb-bc2ff147f109","Type":"ContainerStarted","Data":"3fc968d235cd31abaf552b31bbaf23ec2d6ce3ce23bde717279ccbce4aa8a094"}
Apr 16 19:31:41.020524 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.020516 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" event={"ID":"75ffdd93-6f95-4760-97cb-bc2ff147f109","Type":"ContainerStarted","Data":"bc4c34ef80cb6dbad6d5a813740e168a22b1bc492fd9386bbc51b8e7849cdbfb"}
Apr 16 19:31:41.020524 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.020529 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" event={"ID":"75ffdd93-6f95-4760-97cb-bc2ff147f109","Type":"ContainerStarted","Data":"3f8f68ac29607c6dc498815b0e270f537a9b90b9d0ea22a8cd406b690cfb92e9"}
Apr 16 19:31:41.235103 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.235069 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:31:41.239285 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.239258 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:31:41.241907 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.241879 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 19:31:41.242408 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.242386 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-cpkmg\""
Apr 16 19:31:41.242518 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.242419 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 19:31:41.242816 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.242795 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 19:31:41.242928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.242863 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 19:31:41.242928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.242866 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 19:31:41.242928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.242919 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 19:31:41.243054 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.242868 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 19:31:41.244122 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.243793 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bdq9rdlp8j1v\""
Apr 16 19:31:41.244122 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.243830 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 19:31:41.244122 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.243906 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 19:31:41.244321 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.244295 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 19:31:41.244321 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.244315 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 19:31:41.244923 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.244787 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:31:41.250099 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.250072 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 19:31:41.256074 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.256040 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:31:41.270004 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.269967 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID:
\"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270182 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270015 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270182 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270054 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270182 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270182 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270182 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270140 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-web-config\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270182 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270158 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270441 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270190 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270441 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270213 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270441 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270292 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config-out\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270441 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:31:41.270322 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270441 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270353 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270441 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270370 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270441 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270813 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270470 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7ljg\" (UniqueName: 
\"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-kube-api-access-r7ljg\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270813 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270538 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270813 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270584 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.270813 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.270626 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.371883 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.371848 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372066 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.371894 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372066 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.371920 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372066 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.371945 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-web-config\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372066 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.371972 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372066 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372005 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372066 
ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372027 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372072 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config-out\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372093 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372126 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372151 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372210 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7ljg\" (UniqueName: \"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-kube-api-access-r7ljg\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372290 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372327 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372355 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.372398 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.372386 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.374042 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.374006 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.374177 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.374071 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.376972 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.375971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.376972 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.376663 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.379570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.377352 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.379570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.377492 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.379570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.377546 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.379570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.377928 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.379570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.378024 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.379570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.378188 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.379570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.378345 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.379570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.378644 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.379570 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.378880 2568 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.380053 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.379837 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config-out\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.380053 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.379990 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-web-config\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.380155 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.380064 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.380265 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.380236 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.391477 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.391447 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r7ljg\" (UniqueName: \"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-kube-api-access-r7ljg\") pod \"prometheus-k8s-0\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.473947 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.473903 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/30844433-d473-4019-98ea-1e208e6aea91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-422kv\" (UID: \"30844433-d473-4019-98ea-1e208e6aea91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv" Apr 16 19:31:41.476748 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.476719 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/30844433-d473-4019-98ea-1e208e6aea91-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-422kv\" (UID: \"30844433-d473-4019-98ea-1e208e6aea91\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv" Apr 16 19:31:41.552253 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.552224 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:31:41.561004 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.560981 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv" Apr 16 19:31:41.717127 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.717104 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:31:41.719534 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:41.719508 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-422kv"] Apr 16 19:31:41.722378 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:41.722346 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c9d9f8e_fe59_4e0b_b536_a2a917cadf3c.slice/crio-08c9a81b465a80305014f919384384ff4df099b747c8080fe1f722ab39d0e03c WatchSource:0}: Error finding container 08c9a81b465a80305014f919384384ff4df099b747c8080fe1f722ab39d0e03c: Status 404 returned error can't find the container with id 08c9a81b465a80305014f919384384ff4df099b747c8080fe1f722ab39d0e03c Apr 16 19:31:41.726172 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:31:41.726144 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30844433_d473_4019_98ea_1e208e6aea91.slice/crio-4ae7a8972699ef409e678076f838296e40e09fa11f382472e4945df47051c71e WatchSource:0}: Error finding container 4ae7a8972699ef409e678076f838296e40e09fa11f382472e4945df47051c71e: Status 404 returned error can't find the container with id 4ae7a8972699ef409e678076f838296e40e09fa11f382472e4945df47051c71e Apr 16 19:31:42.026451 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:42.026416 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" event={"ID":"75ffdd93-6f95-4760-97cb-bc2ff147f109","Type":"ContainerStarted","Data":"48932cf473f2e02576d9c062bfe08040392805d93faefb393a98f5fd97ca0cce"} Apr 16 19:31:42.026451 ip-10-0-133-198 kubenswrapper[2568]: I0416 
19:31:42.026460 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" event={"ID":"75ffdd93-6f95-4760-97cb-bc2ff147f109","Type":"ContainerStarted","Data":"ab6170e7575fd87d27771451e39c1c68c0e10826fda173f1fa05bd7b06964abc"} Apr 16 19:31:42.026747 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:42.026472 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" event={"ID":"75ffdd93-6f95-4760-97cb-bc2ff147f109","Type":"ContainerStarted","Data":"bb3dd125e4b080794eca2cb2fcd31979e30fb8f410606c3610c14a39a2e52d9e"} Apr 16 19:31:42.026747 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:42.026611 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" Apr 16 19:31:42.027584 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:42.027555 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv" event={"ID":"30844433-d473-4019-98ea-1e208e6aea91","Type":"ContainerStarted","Data":"4ae7a8972699ef409e678076f838296e40e09fa11f382472e4945df47051c71e"} Apr 16 19:31:42.028617 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:42.028598 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerStarted","Data":"08c9a81b465a80305014f919384384ff4df099b747c8080fe1f722ab39d0e03c"} Apr 16 19:31:42.051824 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:42.051770 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" podStartSLOduration=1.027862564 podStartE2EDuration="4.051753832s" podCreationTimestamp="2026-04-16 19:31:38 +0000 UTC" firstStartedPulling="2026-04-16 19:31:38.515167876 +0000 UTC m=+81.438864453" lastFinishedPulling="2026-04-16 19:31:41.53905914 
+0000 UTC m=+84.462755721" observedRunningTime="2026-04-16 19:31:42.050198604 +0000 UTC m=+84.973895203" watchObservedRunningTime="2026-04-16 19:31:42.051753832 +0000 UTC m=+84.975450432" Apr 16 19:31:42.907312 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:42.907234 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8698848557-44r8j" Apr 16 19:31:43.033388 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:43.033357 2568 generic.go:358] "Generic (PLEG): container finished" podID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerID="42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603" exitCode=0 Apr 16 19:31:43.033559 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:43.033453 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerDied","Data":"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"} Apr 16 19:31:44.038530 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:44.038488 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv" event={"ID":"30844433-d473-4019-98ea-1e208e6aea91","Type":"ContainerStarted","Data":"2b75b7a7b1cdc2f1ede1408525278d3c9b0ddc9a09cbe21ec89d0dc9e6b4fcbe"} Apr 16 19:31:44.039016 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:44.038869 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv" Apr 16 19:31:44.044271 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:44.044245 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv" Apr 16 19:31:44.053621 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:44.053567 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-422kv" podStartSLOduration=3.527823045 podStartE2EDuration="5.053552267s" podCreationTimestamp="2026-04-16 19:31:39 +0000 UTC" firstStartedPulling="2026-04-16 19:31:41.728604479 +0000 UTC m=+84.652301069" lastFinishedPulling="2026-04-16 19:31:43.254333699 +0000 UTC m=+86.178030291" observedRunningTime="2026-04-16 19:31:44.053211262 +0000 UTC m=+86.976907864" watchObservedRunningTime="2026-04-16 19:31:44.053552267 +0000 UTC m=+86.977248868" Apr 16 19:31:47.050539 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:47.050370 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerStarted","Data":"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"} Apr 16 19:31:47.050539 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:47.050406 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerStarted","Data":"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"} Apr 16 19:31:47.050539 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:47.050415 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerStarted","Data":"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"} Apr 16 19:31:47.050539 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:47.050422 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerStarted","Data":"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"} Apr 16 19:31:48.039526 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:48.039495 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/thanos-querier-57b9b4c777-nggfv" Apr 16 19:31:48.058080 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:48.058037 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerStarted","Data":"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"} Apr 16 19:31:48.058080 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:48.058088 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerStarted","Data":"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"} Apr 16 19:31:48.129802 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:48.129742 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.110946208 podStartE2EDuration="7.129723126s" podCreationTimestamp="2026-04-16 19:31:41 +0000 UTC" firstStartedPulling="2026-04-16 19:31:41.725032278 +0000 UTC m=+84.648728856" lastFinishedPulling="2026-04-16 19:31:46.743809184 +0000 UTC m=+89.667505774" observedRunningTime="2026-04-16 19:31:48.128183772 +0000 UTC m=+91.051880373" watchObservedRunningTime="2026-04-16 19:31:48.129723126 +0000 UTC m=+91.053419726" Apr 16 19:31:51.553060 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:31:51.553021 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:32:00.974450 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:00.974419 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-h6v8t" Apr 16 19:32:41.552447 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:41.552408 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
19:32:41.572549 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:41.572516 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:32:42.238256 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:42.238204 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:32:59.656632 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:59.656601 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:32:59.657109 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:59.657069 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="prometheus" containerID="cri-o://a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12" gracePeriod=600 Apr 16 19:32:59.657175 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:59.657112 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy-web" containerID="cri-o://58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe" gracePeriod=600 Apr 16 19:32:59.657175 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:59.657103 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="thanos-sidecar" containerID="cri-o://31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83" gracePeriod=600 Apr 16 19:32:59.657175 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:59.657103 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy" 
containerID="cri-o://cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1" gracePeriod=600 Apr 16 19:32:59.657175 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:59.657128 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy-thanos" containerID="cri-o://6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1" gracePeriod=600 Apr 16 19:32:59.657362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:59.657196 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="config-reloader" containerID="cri-o://aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d" gracePeriod=600 Apr 16 19:32:59.895799 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:32:59.895774 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.058729 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058630 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-kubelet-serving-ca-bundle\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.058729 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058664 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-kube-rbac-proxy\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.058729 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058708 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-tls\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.058729 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058727 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058746 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: 
\"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058784 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-rulefiles-0\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058814 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-metrics-client-ca\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058852 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-tls-assets\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058894 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-grpc-tls\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.058917 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-web-config\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059363 ip-10-0-133-198 kubenswrapper[2568]: 
I0416 19:33:00.059093 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:00.059933 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.059740 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:00.059933 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.059799 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7ljg\" (UniqueName: \"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-kube-api-access-r7ljg\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059933 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.059838 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-thanos-prometheus-http-client-file\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059933 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.059872 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.059933 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.059897 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-trusted-ca-bundle\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.060257 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.059942 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-db\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.060257 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.059980 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-metrics-client-certs\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.060376 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.060353 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config-out\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.060437 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.060409 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-serving-certs-ca-bundle\") pod \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\" (UID: \"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c\") " Apr 16 19:33:00.060738 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.060718 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.060829 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.060746 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-metrics-client-ca\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.061133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.061111 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:00.061760 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.061733 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:00.062119 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.062088 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:00.062222 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.062110 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:00.062222 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.062169 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:00.062222 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.062174 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:33:00.062552 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.062199 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:00.063096 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.063074 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:33:00.063713 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.063667 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:00.064031 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.063988 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config" (OuterVolumeSpecName: "config") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:00.064120 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.064083 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:00.064180 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.064110 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:00.064239 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.064215 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:00.064525 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.064508 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config-out" (OuterVolumeSpecName: "config-out") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:33:00.065157 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.065130 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-kube-api-access-r7ljg" (OuterVolumeSpecName: "kube-api-access-r7ljg") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "kube-api-access-r7ljg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:33:00.073641 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.073622 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-web-config" (OuterVolumeSpecName: "web-config") pod "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" (UID: "5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:00.161467 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161440 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-tls-assets\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161467 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161464 2568 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-grpc-tls\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161467 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161473 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-web-config\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161482 2568 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r7ljg\" (UniqueName: \"kubernetes.io/projected/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-kube-api-access-r7ljg\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161491 2568 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161500 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161509 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161518 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-db\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161527 2568 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-metrics-client-certs\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:33:00.161536 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config-out\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161544 2568 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161553 2568 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-kube-rbac-proxy\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161561 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161569 2568 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.161579 2568 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-config\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:33:00.161655 ip-10-0-133-198 kubenswrapper[2568]: I0416 
19:33:00.161587 2568 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:33:00.274117 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274086 2568 generic.go:358] "Generic (PLEG): container finished" podID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerID="6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1" exitCode=0
Apr 16 19:33:00.274117 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274111 2568 generic.go:358] "Generic (PLEG): container finished" podID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerID="cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1" exitCode=0
Apr 16 19:33:00.274117 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274120 2568 generic.go:358] "Generic (PLEG): container finished" podID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerID="58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe" exitCode=0
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274129 2568 generic.go:358] "Generic (PLEG): container finished" podID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerID="31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83" exitCode=0
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274136 2568 generic.go:358] "Generic (PLEG): container finished" podID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerID="aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d" exitCode=0
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274141 2568 generic.go:358] "Generic (PLEG): container finished" podID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerID="a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12" exitCode=0
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274130 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerDied","Data":"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"}
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274191 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274196 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerDied","Data":"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"}
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274214 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerDied","Data":"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"}
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274227 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerDied","Data":"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"}
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274242 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerDied","Data":"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"}
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274253 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerDied","Data":"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"}
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274261 2568 scope.go:117] "RemoveContainer" containerID="6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"
Apr 16 19:33:00.274351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.274267 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c","Type":"ContainerDied","Data":"08c9a81b465a80305014f919384384ff4df099b747c8080fe1f722ab39d0e03c"}
Apr 16 19:33:00.284880 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.284860 2568 scope.go:117] "RemoveContainer" containerID="cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"
Apr 16 19:33:00.292535 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.292517 2568 scope.go:117] "RemoveContainer" containerID="58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"
Apr 16 19:33:00.299613 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.299596 2568 scope.go:117] "RemoveContainer" containerID="31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"
Apr 16 19:33:00.300387 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.300368 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:33:00.304212 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.304192 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:33:00.306333 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.306313 2568 scope.go:117] "RemoveContainer" containerID="aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"
Apr 16 19:33:00.312422 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.312407 2568 scope.go:117] "RemoveContainer" containerID="a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"
Apr 16 19:33:00.318990 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.318975 2568 scope.go:117] "RemoveContainer" containerID="42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"
Apr 16 19:33:00.325045 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.325031 2568 scope.go:117] "RemoveContainer" containerID="6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"
Apr 16 19:33:00.325312 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:33:00.325286 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": container with ID starting with 6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1 not found: ID does not exist" containerID="6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"
Apr 16 19:33:00.325364 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.325311 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"} err="failed to get container status \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": rpc error: code = NotFound desc = could not find container \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": container with ID starting with 6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1 not found: ID does not exist"
Apr 16 19:33:00.325364 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.325341 2568 scope.go:117] "RemoveContainer" containerID="cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"
Apr 16 19:33:00.325564 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:33:00.325536 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": container with ID starting with cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1 not found: ID does not exist" containerID="cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"
Apr 16 19:33:00.325630 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.325563 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"} err="failed to get container status \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": rpc error: code = NotFound desc = could not find container \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": container with ID starting with cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1 not found: ID does not exist"
Apr 16 19:33:00.325630 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.325587 2568 scope.go:117] "RemoveContainer" containerID="58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"
Apr 16 19:33:00.325839 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:33:00.325821 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": container with ID starting with 58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe not found: ID does not exist" containerID="58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"
Apr 16 19:33:00.325876 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.325845 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"} err="failed to get container status \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": rpc error: code = NotFound desc = could not find container \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": container with ID starting with 58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe not found: ID does not exist"
Apr 16 19:33:00.325876 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.325860 2568 scope.go:117] "RemoveContainer" containerID="31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"
Apr 16 19:33:00.326069 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:33:00.326052 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": container with ID starting with 31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83 not found: ID does not exist" containerID="31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"
Apr 16 19:33:00.326110 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326073 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"} err="failed to get container status \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": rpc error: code = NotFound desc = could not find container \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": container with ID starting with 31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83 not found: ID does not exist"
Apr 16 19:33:00.326110 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326087 2568 scope.go:117] "RemoveContainer" containerID="aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"
Apr 16 19:33:00.326298 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:33:00.326279 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": container with ID starting with aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d not found: ID does not exist" containerID="aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"
Apr 16 19:33:00.326358 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326305 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"} err="failed to get container status \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": rpc error: code = NotFound desc = could not find container \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": container with ID starting with aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d not found: ID does not exist"
Apr 16 19:33:00.326358 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326326 2568 scope.go:117] "RemoveContainer" containerID="a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"
Apr 16 19:33:00.326519 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:33:00.326502 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": container with ID starting with a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12 not found: ID does not exist" containerID="a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"
Apr 16 19:33:00.326556 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326523 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"} err="failed to get container status \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": rpc error: code = NotFound desc = could not find container \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": container with ID starting with a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12 not found: ID does not exist"
Apr 16 19:33:00.326556 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326534 2568 scope.go:117] "RemoveContainer" containerID="42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"
Apr 16 19:33:00.326714 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:33:00.326700 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": container with ID starting with 42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603 not found: ID does not exist" containerID="42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"
Apr 16 19:33:00.326758 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326718 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"} err="failed to get container status \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": rpc error: code = NotFound desc = could not find container \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": container with ID starting with 42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603 not found: ID does not exist"
Apr 16 19:33:00.326758 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326729 2568 scope.go:117] "RemoveContainer" containerID="6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"
Apr 16 19:33:00.326915 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326899 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"} err="failed to get container status \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": rpc error: code = NotFound desc = could not find container \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": container with ID starting with 6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1 not found: ID does not exist"
Apr 16 19:33:00.326955 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.326916 2568 scope.go:117] "RemoveContainer" containerID="cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"
Apr 16 19:33:00.327103 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.327086 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"} err="failed to get container status \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": rpc error: code = NotFound desc = could not find container \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": container with ID starting with cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1 not found: ID does not exist"
Apr 16 19:33:00.327179 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.327108 2568 scope.go:117] "RemoveContainer" containerID="58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"
Apr 16 19:33:00.327376 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.327357 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"} err="failed to get container status \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": rpc error: code = NotFound desc = could not find container \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": container with ID starting with 58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe not found: ID does not exist"
Apr 16 19:33:00.327439 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.327377 2568 scope.go:117] "RemoveContainer" containerID="31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"
Apr 16 19:33:00.327642 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.327603 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"} err="failed to get container status \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": rpc error: code = NotFound desc = could not find container \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": container with ID starting with 31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83 not found: ID does not exist"
Apr 16 19:33:00.327642 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.327628 2568 scope.go:117] "RemoveContainer" containerID="aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"
Apr 16 19:33:00.328062 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.328021 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"} err="failed to get container status \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": rpc error: code = NotFound desc = could not find container \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": container with ID starting with aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d not found: ID does not exist"
Apr 16 19:33:00.328062 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.328047 2568 scope.go:117] "RemoveContainer" containerID="a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"
Apr 16 19:33:00.328349 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.328322 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"} err="failed to get container status \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": rpc error: code = NotFound desc = could not find container \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": container with ID starting with a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12 not found: ID does not exist"
Apr 16 19:33:00.328463 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.328350 2568 scope.go:117] "RemoveContainer" containerID="42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"
Apr 16 19:33:00.328613 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.328595 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"} err="failed to get container status \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": rpc error: code = NotFound desc = could not find container \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": container with ID starting with 42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603 not found: ID does not exist"
Apr 16 19:33:00.328665 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.328614 2568 scope.go:117] "RemoveContainer" containerID="6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"
Apr 16 19:33:00.328938 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.328907 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"} err="failed to get container status \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": rpc error: code = NotFound desc = could not find container \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": container with ID starting with 6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1 not found: ID does not exist"
Apr 16 19:33:00.328938 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.328929 2568 scope.go:117] "RemoveContainer" containerID="cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"
Apr 16 19:33:00.329225 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329194 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"} err="failed to get container status \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": rpc error: code = NotFound desc = could not find container \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": container with ID starting with cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1 not found: ID does not exist"
Apr 16 19:33:00.329225 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329214 2568 scope.go:117] "RemoveContainer" containerID="58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"
Apr 16 19:33:00.329507 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329484 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"} err="failed to get container status \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": rpc error: code = NotFound desc = could not find container \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": container with ID starting with 58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe not found: ID does not exist"
Apr 16 19:33:00.329507 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329507 2568 scope.go:117] "RemoveContainer" containerID="31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"
Apr 16 19:33:00.329614 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329530 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 19:33:00.329827 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329801 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"} err="failed to get container status \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": rpc error: code = NotFound desc = could not find container \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": container with ID starting with 31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83 not found: ID does not exist"
Apr 16 19:33:00.329941 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329829 2568 scope.go:117] "RemoveContainer" containerID="aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"
Apr 16 19:33:00.329941 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329914 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy-web"
Apr 16 19:33:00.329941 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329929 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy-web"
Apr 16 19:33:00.329941 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329940 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329949 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329965 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="prometheus"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329973 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="prometheus"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329983 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy-thanos"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.329991 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy-thanos"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330004 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="thanos-sidecar"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330012 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="thanos-sidecar"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330022 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="init-config-reloader"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330030 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="init-config-reloader"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330048 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="config-reloader"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330058 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="config-reloader"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330105 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="prometheus"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330116 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="config-reloader"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330126 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy"
Apr 16 19:33:00.330134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330137 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy-web"
Apr 16 19:33:00.330633 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330147 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="kube-rbac-proxy-thanos"
Apr 16 19:33:00.330633 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330154 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" containerName="thanos-sidecar"
Apr 16 19:33:00.330633 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330157 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"} err="failed to get container status \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": rpc error: code = NotFound desc = could not find container \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": container with ID starting with aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d not found: ID does not exist"
Apr 16 19:33:00.330633 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330174 2568 scope.go:117] "RemoveContainer" containerID="a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"
Apr 16 19:33:00.330633 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330440 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"} err="failed to get container status \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": rpc error: code = NotFound desc = could not find container \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": container with ID starting with a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12 not found: ID does not exist"
Apr 16 19:33:00.330633 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330455 2568 scope.go:117] "RemoveContainer" containerID="42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"
Apr 16 19:33:00.330903 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330695 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"} err="failed to get container status \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": rpc error: code = NotFound desc = could not find container \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": container with ID starting with 42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603 not found: ID does not exist"
Apr 16 19:33:00.330903 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330714 2568 scope.go:117] "RemoveContainer" containerID="6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"
Apr 16 19:33:00.330967 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330936 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"} err="failed to get container status \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": rpc error: code = NotFound desc = could not find container \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": container with ID starting with 6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1 not found: ID does not exist"
Apr 16 19:33:00.330967 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.330951 2568 scope.go:117] "RemoveContainer" containerID="cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"
Apr 16 19:33:00.331169 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.331150 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"} err="failed to get container status \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": rpc error: code = NotFound desc = could not find container \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": container with ID starting with cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1 not found: ID does not exist"
Apr 16 19:33:00.331215 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.331170 2568 scope.go:117] "RemoveContainer" containerID="58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"
Apr 16 19:33:00.331365 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.331348 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"} err="failed to get container status \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": rpc error: code = NotFound desc = could not find container \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": container with ID starting with 58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe not found: ID does not exist"
Apr 16 19:33:00.331407 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.331366 2568 scope.go:117] "RemoveContainer" containerID="31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"
Apr 16 19:33:00.331560 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.331541 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"} err="failed to get container status \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": rpc error: code = NotFound desc = could not find container \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": container with ID starting with 31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83 not found: ID does not exist"
Apr 16 19:33:00.331600 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.331564 2568 scope.go:117] "RemoveContainer" containerID="aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"
Apr 16 19:33:00.331825 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.331807 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"} err="failed to get container status \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": rpc error: code = NotFound desc = could not find container \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": container with ID starting with aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d not found: ID does not exist"
Apr 16 19:33:00.331825 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.331825 2568 scope.go:117] "RemoveContainer" containerID="a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"
Apr 16 19:33:00.332040 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332020 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"} err="failed to get container status \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": rpc error: code = NotFound desc = could not find container \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": container with ID starting with a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12 not found: ID does not exist"
Apr 16 19:33:00.332104 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332042 2568 scope.go:117] "RemoveContainer" containerID="42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"
Apr 16 19:33:00.332282 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332264 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"} err="failed to get container status \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": rpc error: code = NotFound desc = could not find container \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": container with ID starting with 42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603 not found: ID does not exist"
Apr 16 19:33:00.332347 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332293 2568 scope.go:117] "RemoveContainer" containerID="6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"
Apr 16 19:33:00.332504 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332482 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"} err="failed to get container status \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": rpc error: code = NotFound desc = could not find container \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": container with ID starting with 6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1 not found: ID does not exist"
Apr 16 19:33:00.332556 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332507 2568 scope.go:117] "RemoveContainer" containerID="cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"
Apr 16 19:33:00.332667 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332653 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"} err="failed to get container status \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": rpc error: code = NotFound desc = could not find container \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": container with ID starting with cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1 not found: ID does not exist"
Apr 16 19:33:00.332738 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332668 2568 scope.go:117] "RemoveContainer" containerID="58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"
Apr 16 19:33:00.332903 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332885 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"} err="failed to get container status \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": rpc error: code = NotFound desc = could not find container \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": container with ID starting with 58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe not found: ID does not exist"
Apr 16 19:33:00.332945 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.332904 2568 scope.go:117] "RemoveContainer" containerID="31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"
Apr 16 19:33:00.333101 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333086 2568
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"} err="failed to get container status \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": rpc error: code = NotFound desc = could not find container \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": container with ID starting with 31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83 not found: ID does not exist" Apr 16 19:33:00.333146 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333101 2568 scope.go:117] "RemoveContainer" containerID="aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d" Apr 16 19:33:00.333315 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333296 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"} err="failed to get container status \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": rpc error: code = NotFound desc = could not find container \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": container with ID starting with aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d not found: ID does not exist" Apr 16 19:33:00.333363 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333316 2568 scope.go:117] "RemoveContainer" containerID="a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12" Apr 16 19:33:00.333516 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333500 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"} err="failed to get container status \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": rpc error: code = NotFound desc = could not find container 
\"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": container with ID starting with a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12 not found: ID does not exist" Apr 16 19:33:00.333574 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333516 2568 scope.go:117] "RemoveContainer" containerID="42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603" Apr 16 19:33:00.333717 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333701 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"} err="failed to get container status \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": rpc error: code = NotFound desc = could not find container \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": container with ID starting with 42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603 not found: ID does not exist" Apr 16 19:33:00.333763 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333717 2568 scope.go:117] "RemoveContainer" containerID="6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1" Apr 16 19:33:00.333933 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333913 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1"} err="failed to get container status \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": rpc error: code = NotFound desc = could not find container \"6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1\": container with ID starting with 6d12fcab37e5947b77e62d6b9b2a4d706253875bbe3b44da58b14d88161ef7f1 not found: ID does not exist" Apr 16 19:33:00.333975 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.333935 2568 scope.go:117] "RemoveContainer" 
containerID="cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1" Apr 16 19:33:00.334199 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.334179 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1"} err="failed to get container status \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": rpc error: code = NotFound desc = could not find container \"cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1\": container with ID starting with cddf342833d5a8ac995f8dabd21a4d11b5ae38b15c4b09cae3f3c1aa6267b2f1 not found: ID does not exist" Apr 16 19:33:00.334199 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.334198 2568 scope.go:117] "RemoveContainer" containerID="58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe" Apr 16 19:33:00.334377 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.334362 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe"} err="failed to get container status \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": rpc error: code = NotFound desc = could not find container \"58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe\": container with ID starting with 58566dd825b736399a590a976deee5d60c84fbb59c4ecc0ea6d43defabec5ebe not found: ID does not exist" Apr 16 19:33:00.334377 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.334377 2568 scope.go:117] "RemoveContainer" containerID="31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83" Apr 16 19:33:00.334566 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.334548 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83"} err="failed to get container status 
\"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": rpc error: code = NotFound desc = could not find container \"31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83\": container with ID starting with 31906a40dcb4df7fa32268777ccec4b2e7d7d362cac04f878dc8d7f1607d3c83 not found: ID does not exist" Apr 16 19:33:00.334566 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.334564 2568 scope.go:117] "RemoveContainer" containerID="aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d" Apr 16 19:33:00.334814 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.334796 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d"} err="failed to get container status \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": rpc error: code = NotFound desc = could not find container \"aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d\": container with ID starting with aef2afdd152ef303a6280186c6b271d8145af9bb12481e61188d8dba27f2fe5d not found: ID does not exist" Apr 16 19:33:00.334814 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.334815 2568 scope.go:117] "RemoveContainer" containerID="a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12" Apr 16 19:33:00.335056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.335033 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12"} err="failed to get container status \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": rpc error: code = NotFound desc = could not find container \"a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12\": container with ID starting with a9736e53da2b4da46e97aa07ba71e27e652aab549228109cbceeae14e570dc12 not found: ID does not exist" Apr 16 19:33:00.335056 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:33:00.335057 2568 scope.go:117] "RemoveContainer" containerID="42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603" Apr 16 19:33:00.335325 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.335298 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603"} err="failed to get container status \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": rpc error: code = NotFound desc = could not find container \"42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603\": container with ID starting with 42c05ea2db9163400c1ff7673af3c119527d61e01b02614b4f754a5a373e2603 not found: ID does not exist" Apr 16 19:33:00.335510 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.335496 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.338363 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.338344 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 19:33:00.338472 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.338366 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 19:33:00.338472 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.338460 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-bdq9rdlp8j1v\"" Apr 16 19:33:00.338607 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.338501 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 19:33:00.338757 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.338741 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 19:33:00.338882 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.338783 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 19:33:00.339060 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.339043 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 19:33:00.339603 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.339543 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-cpkmg\"" Apr 16 19:33:00.339603 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.339551 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 19:33:00.339771 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.339551 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 19:33:00.339771 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.339698 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 19:33:00.339928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.339820 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 19:33:00.340898 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.340613 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 19:33:00.345937 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.343863 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 19:33:00.348722 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.348700 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 19:33:00.349967 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.349946 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:33:00.363342 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363322 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363496 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363368 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363496 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363414 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363603 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363490 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363603 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363549 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363606 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15f0261f-6878-42f5-8a58-ae42cf943b64-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363632 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363655 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363840 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363729 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15f0261f-6878-42f5-8a58-ae42cf943b64-config-out\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363840 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363752 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjxb2\" (UniqueName: \"kubernetes.io/projected/15f0261f-6878-42f5-8a58-ae42cf943b64-kube-api-access-pjxb2\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363840 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363790 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363991 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363849 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/15f0261f-6878-42f5-8a58-ae42cf943b64-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363991 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363906 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.363991 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363935 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.364133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.363996 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.364133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.364024 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.364133 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.364050 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-web-config\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.364133 ip-10-0-133-198 kubenswrapper[2568]: 
I0416 19:33:00.364093 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-config\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464714 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464658 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464714 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464719 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464948 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464736 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-web-config\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464948 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464768 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-config\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464948 ip-10-0-133-198 kubenswrapper[2568]: 
I0416 19:33:00.464793 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464948 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464948 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464834 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464948 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464870 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464948 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464902 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.464948 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464943 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15f0261f-6878-42f5-8a58-ae42cf943b64-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.465324 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464968 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.465324 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.464991 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.465324 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.465029 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15f0261f-6878-42f5-8a58-ae42cf943b64-config-out\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.465324 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.465051 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjxb2\" (UniqueName: 
\"kubernetes.io/projected/15f0261f-6878-42f5-8a58-ae42cf943b64-kube-api-access-pjxb2\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.465324 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.465080 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.465324 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.465109 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/15f0261f-6878-42f5-8a58-ae42cf943b64-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.465324 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.465144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.465324 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.465169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.465753 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.465522 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.467621 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.466668 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/15f0261f-6878-42f5-8a58-ae42cf943b64-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.467621 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.466993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.467621 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.466993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.468190 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.468083 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.468734 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:33:00.468493 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-web-config\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.468734 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.468607 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.468893 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.468871 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.469239 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.469221 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.469429 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.469413 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/15f0261f-6878-42f5-8a58-ae42cf943b64-config-out\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.470352 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:33:00.470327 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.470488 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.470393 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-config\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.470488 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.470442 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.470488 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.470461 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.470865 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.470840 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/15f0261f-6878-42f5-8a58-ae42cf943b64-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.470865 
ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.470860 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/15f0261f-6878-42f5-8a58-ae42cf943b64-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.471733 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.471716 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/15f0261f-6878-42f5-8a58-ae42cf943b64-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.474978 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.474955 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjxb2\" (UniqueName: \"kubernetes.io/projected/15f0261f-6878-42f5-8a58-ae42cf943b64-kube-api-access-pjxb2\") pod \"prometheus-k8s-0\" (UID: \"15f0261f-6878-42f5-8a58-ae42cf943b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.650069 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.650045 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:33:00.777603 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:00.777578 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:33:00.779646 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:33:00.779617 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15f0261f_6878_42f5_8a58_ae42cf943b64.slice/crio-6780ba0d4b3b52df0a6c223bcbfd219918cdc872fe0579d72b0f467febab0b39 WatchSource:0}: Error finding container 6780ba0d4b3b52df0a6c223bcbfd219918cdc872fe0579d72b0f467febab0b39: Status 404 returned error can't find the container with id 6780ba0d4b3b52df0a6c223bcbfd219918cdc872fe0579d72b0f467febab0b39 Apr 16 19:33:01.279019 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:01.278981 2568 generic.go:358] "Generic (PLEG): container finished" podID="15f0261f-6878-42f5-8a58-ae42cf943b64" containerID="9788ab0e249a7ab94eea3169b58a2a1bafc294b974e87ac4814c35169505628d" exitCode=0 Apr 16 19:33:01.279185 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:01.279056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15f0261f-6878-42f5-8a58-ae42cf943b64","Type":"ContainerDied","Data":"9788ab0e249a7ab94eea3169b58a2a1bafc294b974e87ac4814c35169505628d"} Apr 16 19:33:01.279185 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:01.279080 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15f0261f-6878-42f5-8a58-ae42cf943b64","Type":"ContainerStarted","Data":"6780ba0d4b3b52df0a6c223bcbfd219918cdc872fe0579d72b0f467febab0b39"} Apr 16 19:33:01.652866 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:01.652840 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c" 
path="/var/lib/kubelet/pods/5c9d9f8e-fe59-4e0b-b536-a2a917cadf3c/volumes" Apr 16 19:33:02.285415 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:02.285383 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15f0261f-6878-42f5-8a58-ae42cf943b64","Type":"ContainerStarted","Data":"ee2db9052bde9eba6c30335f5d541bc7bf472b27790c5a0ff4cfbd489a5cc218"} Apr 16 19:33:02.285415 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:02.285416 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15f0261f-6878-42f5-8a58-ae42cf943b64","Type":"ContainerStarted","Data":"7e530184e2f059a2527879d90d6e9cb5b8d4b2f64c41d9f97e58d15a9048933f"} Apr 16 19:33:02.285823 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:02.285426 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15f0261f-6878-42f5-8a58-ae42cf943b64","Type":"ContainerStarted","Data":"f7403110ba2216b49bb2be19da2dd8042e6bd5752dbc79ea3981138db318de75"} Apr 16 19:33:02.285823 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:02.285436 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15f0261f-6878-42f5-8a58-ae42cf943b64","Type":"ContainerStarted","Data":"3f0124024919699b0c9e02ad4cc67c8ce41e7e6764a8ae5d660b0e86a50c9543"} Apr 16 19:33:02.285823 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:02.285444 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"15f0261f-6878-42f5-8a58-ae42cf943b64","Type":"ContainerStarted","Data":"59a1d07b8eed91ca57fc04b3a8cd0c97786a98cf4ff75afeb21d7029407ed47b"} Apr 16 19:33:02.285823 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:02.285453 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"15f0261f-6878-42f5-8a58-ae42cf943b64","Type":"ContainerStarted","Data":"2f45f8b64276e6d8f78321072526ca8523d2ea5cb918282f99921f92d5f4cff7"} Apr 16 19:33:02.312999 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:02.312930 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.312817628 podStartE2EDuration="2.312817628s" podCreationTimestamp="2026-04-16 19:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:33:02.311649196 +0000 UTC m=+165.235345796" watchObservedRunningTime="2026-04-16 19:33:02.312817628 +0000 UTC m=+165.236514230" Apr 16 19:33:05.651224 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:33:05.651194 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:34:00.651105 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:34:00.651072 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:34:00.667200 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:34:00.667172 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:34:01.475561 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:34:01.475533 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:35:17.524916 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:35:17.524887 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 19:36:03.654352 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.654320 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c"] Apr 16 19:36:03.657613 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.657594 2568 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.662263 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.662243 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 19:36:03.662380 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.662293 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-vrbr4\"" Apr 16 19:36:03.663188 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.663170 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 19:36:03.663389 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.663376 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 19:36:03.664823 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.664803 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 19:36:03.689824 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.689788 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c"] Apr 16 19:36:03.786342 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.786304 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q58r4\" (UniqueName: \"kubernetes.io/projected/7458448c-7f4a-4d6e-8288-a5cc4c0993c2-kube-api-access-q58r4\") pod \"opendatahub-operator-controller-manager-57586b9555-v675c\" (UID: \"7458448c-7f4a-4d6e-8288-a5cc4c0993c2\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.786525 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:36:03.786356 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7458448c-7f4a-4d6e-8288-a5cc4c0993c2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-v675c\" (UID: \"7458448c-7f4a-4d6e-8288-a5cc4c0993c2\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.786525 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.786465 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7458448c-7f4a-4d6e-8288-a5cc4c0993c2-webhook-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-v675c\" (UID: \"7458448c-7f4a-4d6e-8288-a5cc4c0993c2\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.887129 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.887092 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7458448c-7f4a-4d6e-8288-a5cc4c0993c2-webhook-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-v675c\" (UID: \"7458448c-7f4a-4d6e-8288-a5cc4c0993c2\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.887290 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.887142 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q58r4\" (UniqueName: \"kubernetes.io/projected/7458448c-7f4a-4d6e-8288-a5cc4c0993c2-kube-api-access-q58r4\") pod \"opendatahub-operator-controller-manager-57586b9555-v675c\" (UID: \"7458448c-7f4a-4d6e-8288-a5cc4c0993c2\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.887341 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.887314 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7458448c-7f4a-4d6e-8288-a5cc4c0993c2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-v675c\" (UID: \"7458448c-7f4a-4d6e-8288-a5cc4c0993c2\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.889558 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.889533 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7458448c-7f4a-4d6e-8288-a5cc4c0993c2-webhook-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-v675c\" (UID: \"7458448c-7f4a-4d6e-8288-a5cc4c0993c2\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.889711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.889644 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7458448c-7f4a-4d6e-8288-a5cc4c0993c2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-v675c\" (UID: \"7458448c-7f4a-4d6e-8288-a5cc4c0993c2\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.902277 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.902248 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q58r4\" (UniqueName: \"kubernetes.io/projected/7458448c-7f4a-4d6e-8288-a5cc4c0993c2-kube-api-access-q58r4\") pod \"opendatahub-operator-controller-manager-57586b9555-v675c\" (UID: \"7458448c-7f4a-4d6e-8288-a5cc4c0993c2\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:03.971549 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:03.971464 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:04.093907 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:04.093879 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c"] Apr 16 19:36:04.097317 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:36:04.097291 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7458448c_7f4a_4d6e_8288_a5cc4c0993c2.slice/crio-42b27e603ba5508580e3f94a7a12b39f149aef46379f2d8ccf95c3189aba4f45 WatchSource:0}: Error finding container 42b27e603ba5508580e3f94a7a12b39f149aef46379f2d8ccf95c3189aba4f45: Status 404 returned error can't find the container with id 42b27e603ba5508580e3f94a7a12b39f149aef46379f2d8ccf95c3189aba4f45 Apr 16 19:36:04.098811 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:04.098796 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:36:04.808412 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:04.808348 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" event={"ID":"7458448c-7f4a-4d6e-8288-a5cc4c0993c2","Type":"ContainerStarted","Data":"42b27e603ba5508580e3f94a7a12b39f149aef46379f2d8ccf95c3189aba4f45"} Apr 16 19:36:06.815085 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:06.815048 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" event={"ID":"7458448c-7f4a-4d6e-8288-a5cc4c0993c2","Type":"ContainerStarted","Data":"37ac8835c3187416a99853c074d5943d63edbea07ce05f246c80d66b6e88784a"} Apr 16 19:36:06.815459 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:06.815194 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:06.865106 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:06.865049 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" podStartSLOduration=1.341947272 podStartE2EDuration="3.865033807s" podCreationTimestamp="2026-04-16 19:36:03 +0000 UTC" firstStartedPulling="2026-04-16 19:36:04.098915873 +0000 UTC m=+347.022612451" lastFinishedPulling="2026-04-16 19:36:06.622002396 +0000 UTC m=+349.545698986" observedRunningTime="2026-04-16 19:36:06.862743426 +0000 UTC m=+349.786440026" watchObservedRunningTime="2026-04-16 19:36:06.865033807 +0000 UTC m=+349.788730407" Apr 16 19:36:13.134962 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.134928 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs"] Apr 16 19:36:13.138610 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.138589 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.141323 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.141298 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 19:36:13.141435 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.141350 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 19:36:13.142357 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.142335 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:36:13.142484 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.142385 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 19:36:13.142484 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.142334 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 19:36:13.142484 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.142473 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-9rnfn\"" Apr 16 19:36:13.151745 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.151720 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs"] Apr 16 19:36:13.268750 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.268708 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6f0dab7e-ceb4-418e-98e0-f497e72ca500-manager-config\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " 
pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.268750 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.268751 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klr8f\" (UniqueName: \"kubernetes.io/projected/6f0dab7e-ceb4-418e-98e0-f497e72ca500-kube-api-access-klr8f\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.268957 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.268819 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f0dab7e-ceb4-418e-98e0-f497e72ca500-metrics-cert\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.268957 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.268856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f0dab7e-ceb4-418e-98e0-f497e72ca500-cert\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.370058 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.370009 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f0dab7e-ceb4-418e-98e0-f497e72ca500-metrics-cert\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.370240 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.370069 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f0dab7e-ceb4-418e-98e0-f497e72ca500-cert\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.370240 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.370120 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6f0dab7e-ceb4-418e-98e0-f497e72ca500-manager-config\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.370240 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.370137 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klr8f\" (UniqueName: \"kubernetes.io/projected/6f0dab7e-ceb4-418e-98e0-f497e72ca500-kube-api-access-klr8f\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.370873 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.370812 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6f0dab7e-ceb4-418e-98e0-f497e72ca500-manager-config\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.372826 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.372796 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f0dab7e-ceb4-418e-98e0-f497e72ca500-metrics-cert\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " 
pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.372932 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.372841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f0dab7e-ceb4-418e-98e0-f497e72ca500-cert\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.378312 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.378285 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klr8f\" (UniqueName: \"kubernetes.io/projected/6f0dab7e-ceb4-418e-98e0-f497e72ca500-kube-api-access-klr8f\") pod \"lws-controller-manager-5bfdb756-t47zs\" (UID: \"6f0dab7e-ceb4-418e-98e0-f497e72ca500\") " pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.448601 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.448501 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:13.577200 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.577162 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs"] Apr 16 19:36:13.580500 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:36:13.580471 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f0dab7e_ceb4_418e_98e0_f497e72ca500.slice/crio-9055789cdc76b25cb71103a49f3e50c85ad4f17ba5b0e23902d29c8a7d3c9e37 WatchSource:0}: Error finding container 9055789cdc76b25cb71103a49f3e50c85ad4f17ba5b0e23902d29c8a7d3c9e37: Status 404 returned error can't find the container with id 9055789cdc76b25cb71103a49f3e50c85ad4f17ba5b0e23902d29c8a7d3c9e37 Apr 16 19:36:13.844056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:13.844020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" event={"ID":"6f0dab7e-ceb4-418e-98e0-f497e72ca500","Type":"ContainerStarted","Data":"9055789cdc76b25cb71103a49f3e50c85ad4f17ba5b0e23902d29c8a7d3c9e37"} Apr 16 19:36:16.856534 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:16.856496 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" event={"ID":"6f0dab7e-ceb4-418e-98e0-f497e72ca500","Type":"ContainerStarted","Data":"27b8d00ef1dffde1e5512d4afc5a1b4f41f5ffdd8da49c8399d40b7961ade84e"} Apr 16 19:36:16.856997 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:16.856619 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:16.883417 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:16.883367 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" podStartSLOduration=1.283182059 podStartE2EDuration="3.883350714s" podCreationTimestamp="2026-04-16 19:36:13 +0000 UTC" firstStartedPulling="2026-04-16 19:36:13.582253081 +0000 UTC m=+356.505949659" lastFinishedPulling="2026-04-16 19:36:16.182421722 +0000 UTC m=+359.106118314" observedRunningTime="2026-04-16 19:36:16.881722869 +0000 UTC m=+359.805419466" watchObservedRunningTime="2026-04-16 19:36:16.883350714 +0000 UTC m=+359.807047313" Apr 16 19:36:17.821163 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:17.821132 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-v675c" Apr 16 19:36:27.863098 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:27.863067 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5bfdb756-t47zs" Apr 16 19:36:54.076248 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.076212 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"] Apr 16 19:36:54.082995 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.082972 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5" Apr 16 19:36:54.085613 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.085584 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 19:36:54.085781 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.085612 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 19:36:54.085781 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.085595 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 19:36:54.085781 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.085690 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-9k96t\"" Apr 16 19:36:54.091740 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.091713 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"] Apr 16 19:36:54.223620 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.223579 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5" Apr 16 19:36:54.223620 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.223625 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.223918 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.223702 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/252f8668-c013-4a42-9977-123205eefcdb-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.223918 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.223753 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/252f8668-c013-4a42-9977-123205eefcdb-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.223918 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.223797 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wfcn\" (UniqueName: \"kubernetes.io/projected/252f8668-c013-4a42-9977-123205eefcdb-kube-api-access-7wfcn\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.223918 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.223818 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/252f8668-c013-4a42-9977-123205eefcdb-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.224131 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.223930 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.224131 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.223951 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.224131 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.223967 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.325346 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.325305 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/252f8668-c013-4a42-9977-123205eefcdb-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.325541 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.325357 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/252f8668-c013-4a42-9977-123205eefcdb-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.325541 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.325410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wfcn\" (UniqueName: \"kubernetes.io/projected/252f8668-c013-4a42-9977-123205eefcdb-kube-api-access-7wfcn\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.325541 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.325441 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/252f8668-c013-4a42-9977-123205eefcdb-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.325541 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.325514 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.325819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.325544 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.326010 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.325979 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.326010 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.325995 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.326180 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.326070 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.326180 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.326097 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.326180 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.326115 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.326382 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.326212 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/252f8668-c013-4a42-9977-123205eefcdb-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.326382 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.326331 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.326496 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.326456 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.328125 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.328106 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/252f8668-c013-4a42-9977-123205eefcdb-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.328232 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.328195 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/252f8668-c013-4a42-9977-123205eefcdb-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.334746 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.334715 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/252f8668-c013-4a42-9977-123205eefcdb-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.334933 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.334905 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wfcn\" (UniqueName: \"kubernetes.io/projected/252f8668-c013-4a42-9977-123205eefcdb-kube-api-access-7wfcn\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f4ffn5\" (UID: \"252f8668-c013-4a42-9977-123205eefcdb\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.394783 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.394745 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:54.523527 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.523462 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"]
Apr 16 19:36:54.526852 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:36:54.526823 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252f8668_c013_4a42_9977_123205eefcdb.slice/crio-3c9b28be37f3f443d90015d061363a3de44ae7b3c5969899f62aca73b822e2df WatchSource:0}: Error finding container 3c9b28be37f3f443d90015d061363a3de44ae7b3c5969899f62aca73b822e2df: Status 404 returned error can't find the container with id 3c9b28be37f3f443d90015d061363a3de44ae7b3c5969899f62aca73b822e2df
Apr 16 19:36:54.982694 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:54.982637 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5" event={"ID":"252f8668-c013-4a42-9977-123205eefcdb","Type":"ContainerStarted","Data":"3c9b28be37f3f443d90015d061363a3de44ae7b3c5969899f62aca73b822e2df"}
Apr 16 19:36:57.087335 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:57.087287 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 16 19:36:57.087620 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:57.087381 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 16 19:36:57.087620 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:57.087410 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 16 19:36:57.994251 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:57.994208 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5" event={"ID":"252f8668-c013-4a42-9977-123205eefcdb","Type":"ContainerStarted","Data":"7c0e833fd026491e9299cc7a67babf8a54f2480cdd17cd54b2dbc1de8437043b"}
Apr 16 19:36:58.014350 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:58.014293 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5" podStartSLOduration=1.4560659440000001 podStartE2EDuration="4.014277297s" podCreationTimestamp="2026-04-16 19:36:54 +0000 UTC" firstStartedPulling="2026-04-16 19:36:54.528804939 +0000 UTC m=+397.452501516" lastFinishedPulling="2026-04-16 19:36:57.087016289 +0000 UTC m=+400.010712869" observedRunningTime="2026-04-16 19:36:58.012706629 +0000 UTC m=+400.936403230" watchObservedRunningTime="2026-04-16 19:36:58.014277297 +0000 UTC m=+400.937973896"
Apr 16 19:36:58.395154 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:58.395111 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:58.399951 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:58.399921 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:58.997574 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:58.997541 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:36:58.998601 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:36:58.998577 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f4ffn5"
Apr 16 19:37:21.410889 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.410855 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-n55pc"]
Apr 16 19:37:21.414194 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.414174 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-n55pc"
Apr 16 19:37:21.416590 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.416568 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 19:37:21.417450 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.417432 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 19:37:21.417526 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.417432 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-6dv74\""
Apr 16 19:37:21.423437 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.423410 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-n55pc"]
Apr 16 19:37:21.467474 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.467434 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzt4x\" (UniqueName: \"kubernetes.io/projected/a531335b-4da1-471c-ab04-0242d261e113-kube-api-access-gzt4x\") pod \"kuadrant-operator-catalog-n55pc\" (UID: \"a531335b-4da1-471c-ab04-0242d261e113\") " pod="kuadrant-system/kuadrant-operator-catalog-n55pc"
Apr 16 19:37:21.568210 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.568172 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzt4x\" (UniqueName: \"kubernetes.io/projected/a531335b-4da1-471c-ab04-0242d261e113-kube-api-access-gzt4x\") pod \"kuadrant-operator-catalog-n55pc\" (UID: \"a531335b-4da1-471c-ab04-0242d261e113\") " pod="kuadrant-system/kuadrant-operator-catalog-n55pc"
Apr 16 19:37:21.580306 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.580269 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzt4x\" (UniqueName: \"kubernetes.io/projected/a531335b-4da1-471c-ab04-0242d261e113-kube-api-access-gzt4x\") pod \"kuadrant-operator-catalog-n55pc\" (UID: \"a531335b-4da1-471c-ab04-0242d261e113\") " pod="kuadrant-system/kuadrant-operator-catalog-n55pc"
Apr 16 19:37:21.724432 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.724328 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-n55pc"
Apr 16 19:37:21.781609 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.781535 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-n55pc"]
Apr 16 19:37:21.847202 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.847169 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-n55pc"]
Apr 16 19:37:21.850274 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:37:21.850228 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda531335b_4da1_471c_ab04_0242d261e113.slice/crio-b346153786cc9b69319535192b86aa762cf2fba85f28c288a4f41968d6dda254 WatchSource:0}: Error finding container b346153786cc9b69319535192b86aa762cf2fba85f28c288a4f41968d6dda254: Status 404 returned error can't find the container with id b346153786cc9b69319535192b86aa762cf2fba85f28c288a4f41968d6dda254
Apr 16 19:37:21.992488 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.992397 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mrmsv"]
Apr 16 19:37:21.997052 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:21.997026 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mrmsv"
Apr 16 19:37:22.002405 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:22.002379 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mrmsv"]
Apr 16 19:37:22.070341 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:22.070306 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-n55pc" event={"ID":"a531335b-4da1-471c-ab04-0242d261e113","Type":"ContainerStarted","Data":"b346153786cc9b69319535192b86aa762cf2fba85f28c288a4f41968d6dda254"}
Apr 16 19:37:22.071755 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:22.071738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv5wd\" (UniqueName: \"kubernetes.io/projected/8a729e29-27a0-4a24-8ec0-ff538e303f7e-kube-api-access-bv5wd\") pod \"kuadrant-operator-catalog-mrmsv\" (UID: \"8a729e29-27a0-4a24-8ec0-ff538e303f7e\") " pod="kuadrant-system/kuadrant-operator-catalog-mrmsv"
Apr 16 19:37:22.173100 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:22.173055 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bv5wd\" (UniqueName: \"kubernetes.io/projected/8a729e29-27a0-4a24-8ec0-ff538e303f7e-kube-api-access-bv5wd\") pod \"kuadrant-operator-catalog-mrmsv\" (UID: \"8a729e29-27a0-4a24-8ec0-ff538e303f7e\") " pod="kuadrant-system/kuadrant-operator-catalog-mrmsv"
Apr 16 19:37:22.182012 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:22.181988 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv5wd\" (UniqueName: \"kubernetes.io/projected/8a729e29-27a0-4a24-8ec0-ff538e303f7e-kube-api-access-bv5wd\") pod \"kuadrant-operator-catalog-mrmsv\" (UID: \"8a729e29-27a0-4a24-8ec0-ff538e303f7e\") " pod="kuadrant-system/kuadrant-operator-catalog-mrmsv"
Apr 16 19:37:22.308291 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:22.308209 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mrmsv"
Apr 16 19:37:22.443594 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:22.443564 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mrmsv"]
Apr 16 19:37:22.445794 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:37:22.445762 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a729e29_27a0_4a24_8ec0_ff538e303f7e.slice/crio-c24752b89c1ea340c5e3dff54cb402076e295bc6edf488cbbc25aba73da39d72 WatchSource:0}: Error finding container c24752b89c1ea340c5e3dff54cb402076e295bc6edf488cbbc25aba73da39d72: Status 404 returned error can't find the container with id c24752b89c1ea340c5e3dff54cb402076e295bc6edf488cbbc25aba73da39d72
Apr 16 19:37:23.076023 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:23.075950 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mrmsv" event={"ID":"8a729e29-27a0-4a24-8ec0-ff538e303f7e","Type":"ContainerStarted","Data":"c24752b89c1ea340c5e3dff54cb402076e295bc6edf488cbbc25aba73da39d72"}
Apr 16 19:37:24.080811 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:24.080773 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mrmsv" event={"ID":"8a729e29-27a0-4a24-8ec0-ff538e303f7e","Type":"ContainerStarted","Data":"8eb874b2e5e1a89fcb4abab31cd498eb135f78fdea336769f9fa3c38a39f1bde"}
Apr 16 19:37:25.086912 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:25.086876 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-n55pc" event={"ID":"a531335b-4da1-471c-ab04-0242d261e113","Type":"ContainerStarted","Data":"fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08"}
Apr 16 19:37:25.087400 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:25.087025 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-n55pc" podUID="a531335b-4da1-471c-ab04-0242d261e113" containerName="registry-server" containerID="cri-o://fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08" gracePeriod=2
Apr 16 19:37:25.103556 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:25.103515 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-mrmsv" podStartSLOduration=2.55089677 podStartE2EDuration="4.103500829s" podCreationTimestamp="2026-04-16 19:37:21 +0000 UTC" firstStartedPulling="2026-04-16 19:37:22.447261284 +0000 UTC m=+425.370957862" lastFinishedPulling="2026-04-16 19:37:23.999865343 +0000 UTC m=+426.923561921" observedRunningTime="2026-04-16 19:37:25.101259653 +0000 UTC m=+428.024956254" watchObservedRunningTime="2026-04-16 19:37:25.103500829 +0000 UTC m=+428.027197434"
Apr 16 19:37:25.115764 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:25.115721 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-n55pc" podStartSLOduration=1.970317241 podStartE2EDuration="4.115711008s" podCreationTimestamp="2026-04-16 19:37:21 +0000 UTC" firstStartedPulling="2026-04-16 19:37:21.851581338 +0000 UTC m=+424.775277916" lastFinishedPulling="2026-04-16 19:37:23.996975088 +0000 UTC m=+426.920671683" observedRunningTime="2026-04-16 19:37:25.115091369 +0000 UTC m=+428.038787966" watchObservedRunningTime="2026-04-16 19:37:25.115711008 +0000 UTC m=+428.039407637"
Apr 16 19:37:25.322647 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:25.322619 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-n55pc"
Apr 16 19:37:25.403111 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:25.403037 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzt4x\" (UniqueName: \"kubernetes.io/projected/a531335b-4da1-471c-ab04-0242d261e113-kube-api-access-gzt4x\") pod \"a531335b-4da1-471c-ab04-0242d261e113\" (UID: \"a531335b-4da1-471c-ab04-0242d261e113\") "
Apr 16 19:37:25.405215 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:25.405195 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a531335b-4da1-471c-ab04-0242d261e113-kube-api-access-gzt4x" (OuterVolumeSpecName: "kube-api-access-gzt4x") pod "a531335b-4da1-471c-ab04-0242d261e113" (UID: "a531335b-4da1-471c-ab04-0242d261e113"). InnerVolumeSpecName "kube-api-access-gzt4x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:37:25.503997 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:25.503969 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gzt4x\" (UniqueName: \"kubernetes.io/projected/a531335b-4da1-471c-ab04-0242d261e113-kube-api-access-gzt4x\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:37:26.091419 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:26.091382 2568 generic.go:358] "Generic (PLEG): container finished" podID="a531335b-4da1-471c-ab04-0242d261e113" containerID="fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08" exitCode=0
Apr 16 19:37:26.091908 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:26.091445 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-n55pc" event={"ID":"a531335b-4da1-471c-ab04-0242d261e113","Type":"ContainerDied","Data":"fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08"}
Apr 16 19:37:26.091908 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:26.091448 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-n55pc"
Apr 16 19:37:26.091908 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:26.091470 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-n55pc" event={"ID":"a531335b-4da1-471c-ab04-0242d261e113","Type":"ContainerDied","Data":"b346153786cc9b69319535192b86aa762cf2fba85f28c288a4f41968d6dda254"}
Apr 16 19:37:26.091908 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:26.091485 2568 scope.go:117] "RemoveContainer" containerID="fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08"
Apr 16 19:37:26.099957 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:26.099931 2568 scope.go:117] "RemoveContainer" containerID="fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08"
Apr 16 19:37:26.100199 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:37:26.100183 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08\": container with ID starting with fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08 not found: ID does not exist" containerID="fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08"
Apr 16 19:37:26.100263 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:26.100211 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08"} err="failed to get container status \"fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08\": rpc error: code = NotFound desc = could not find container \"fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08\": container with ID starting with fee84dee29e2dbff6730b6b06f555a3a90f3d297e486fb83dfe9362f71239b08 not found: ID does not exist"
Apr 16 19:37:26.108549 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:26.108523 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-n55pc"]
Apr 16 19:37:26.110651 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:26.110633 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-n55pc"]
Apr 16 19:37:27.652278 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:27.652236 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a531335b-4da1-471c-ab04-0242d261e113" path="/var/lib/kubelet/pods/a531335b-4da1-471c-ab04-0242d261e113/volumes"
Apr 16 19:37:32.308474 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:32.308439 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-mrmsv"
Apr 16 19:37:32.308474 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:32.308482 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-mrmsv"
Apr 16 19:37:32.333445 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:32.333415 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-mrmsv"
Apr 16 19:37:33.137109 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:33.137083 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-mrmsv"
Apr 16 19:37:57.269550 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.269516 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"]
Apr 16 19:37:57.269943 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.269877 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a531335b-4da1-471c-ab04-0242d261e113" containerName="registry-server"
Apr 16 19:37:57.269943 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.269888 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a531335b-4da1-471c-ab04-0242d261e113" containerName="registry-server"
Apr 16 19:37:57.269943 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.269945 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a531335b-4da1-471c-ab04-0242d261e113" containerName="registry-server"
Apr 16 19:37:57.277580 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.277547 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"
Apr 16 19:37:57.279945 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.279927 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-5wzxq\""
Apr 16 19:37:57.283796 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.283770 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"]
Apr 16 19:37:57.376134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.376103 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpl5b\" (UniqueName: \"kubernetes.io/projected/8748c11e-0743-4946-aace-88de6de54143-kube-api-access-dpl5b\") pod \"limitador-operator-controller-manager-85c4996f8c-ctrcz\" (UID: \"8748c11e-0743-4946-aace-88de6de54143\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"
Apr 16 19:37:57.476929 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.476887 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpl5b\" (UniqueName: \"kubernetes.io/projected/8748c11e-0743-4946-aace-88de6de54143-kube-api-access-dpl5b\") pod \"limitador-operator-controller-manager-85c4996f8c-ctrcz\" (UID: \"8748c11e-0743-4946-aace-88de6de54143\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"
Apr 16 19:37:57.492533 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.492497 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpl5b\" (UniqueName: \"kubernetes.io/projected/8748c11e-0743-4946-aace-88de6de54143-kube-api-access-dpl5b\") pod \"limitador-operator-controller-manager-85c4996f8c-ctrcz\" (UID: \"8748c11e-0743-4946-aace-88de6de54143\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"
Apr 16 19:37:57.590196 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.590166 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"
Apr 16 19:37:57.710996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:57.710969 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"]
Apr 16 19:37:57.713827 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:37:57.713796 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8748c11e_0743_4946_aace_88de6de54143.slice/crio-74d815b9982bfb6a2efa3cb14089a4a3179a34c5b14601160dd40e0379fc52ca WatchSource:0}: Error finding container 74d815b9982bfb6a2efa3cb14089a4a3179a34c5b14601160dd40e0379fc52ca: Status 404 returned error can't find the container with id 74d815b9982bfb6a2efa3cb14089a4a3179a34c5b14601160dd40e0379fc52ca
Apr 16 19:37:58.190191 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:37:58.190159 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz" event={"ID":"8748c11e-0743-4946-aace-88de6de54143","Type":"ContainerStarted","Data":"74d815b9982bfb6a2efa3cb14089a4a3179a34c5b14601160dd40e0379fc52ca"}
Apr 16 19:38:00.199256 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.199214 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz" event={"ID":"8748c11e-0743-4946-aace-88de6de54143","Type":"ContainerStarted","Data":"dfc2d73c9555b10f13380f80ad1dede5f280b829a789428b30fa8016eeb63832"}
Apr 16 19:38:00.199659 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.199283 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"
Apr 16 19:38:00.217901 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.217844 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz" podStartSLOduration=1.243393792 podStartE2EDuration="3.21782611s" podCreationTimestamp="2026-04-16 19:37:57 +0000 UTC" firstStartedPulling="2026-04-16 19:37:57.715992417 +0000 UTC m=+460.639688996" lastFinishedPulling="2026-04-16 19:37:59.690424733 +0000 UTC m=+462.614121314" observedRunningTime="2026-04-16 19:38:00.216072075 +0000 UTC m=+463.139768698" watchObservedRunningTime="2026-04-16 19:38:00.21782611 +0000 UTC m=+463.141522709"
Apr 16 19:38:00.567813 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.567730 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt"]
Apr 16 19:38:00.571004 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.570972 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" Apr 16 19:38:00.573578 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.573555 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 19:38:00.573687 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.573619 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-7wfw8\"" Apr 16 19:38:00.591555 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.591526 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt"] Apr 16 19:38:00.705960 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.705917 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxc28\" (UniqueName: \"kubernetes.io/projected/c75a6e35-6996-48d2-a141-e778996fc546-kube-api-access-rxc28\") pod \"dns-operator-controller-manager-648d5c98bc-bjtrt\" (UID: \"c75a6e35-6996-48d2-a141-e778996fc546\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" Apr 16 19:38:00.807126 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.807089 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxc28\" (UniqueName: \"kubernetes.io/projected/c75a6e35-6996-48d2-a141-e778996fc546-kube-api-access-rxc28\") pod \"dns-operator-controller-manager-648d5c98bc-bjtrt\" (UID: \"c75a6e35-6996-48d2-a141-e778996fc546\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" Apr 16 19:38:00.817717 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.817661 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxc28\" (UniqueName: \"kubernetes.io/projected/c75a6e35-6996-48d2-a141-e778996fc546-kube-api-access-rxc28\") pod 
\"dns-operator-controller-manager-648d5c98bc-bjtrt\" (UID: \"c75a6e35-6996-48d2-a141-e778996fc546\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" Apr 16 19:38:00.881264 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:00.881223 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" Apr 16 19:38:01.037145 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:01.037108 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt"] Apr 16 19:38:01.038435 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:38:01.038396 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc75a6e35_6996_48d2_a141_e778996fc546.slice/crio-52c79957e0b849cfa9a30ae245d1175bf89c52e53122ca899d1a343afbade2f3 WatchSource:0}: Error finding container 52c79957e0b849cfa9a30ae245d1175bf89c52e53122ca899d1a343afbade2f3: Status 404 returned error can't find the container with id 52c79957e0b849cfa9a30ae245d1175bf89c52e53122ca899d1a343afbade2f3 Apr 16 19:38:01.203973 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:01.203880 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" event={"ID":"c75a6e35-6996-48d2-a141-e778996fc546","Type":"ContainerStarted","Data":"52c79957e0b849cfa9a30ae245d1175bf89c52e53122ca899d1a343afbade2f3"} Apr 16 19:38:04.217749 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:04.217706 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" event={"ID":"c75a6e35-6996-48d2-a141-e778996fc546","Type":"ContainerStarted","Data":"db75b481042b4931fdeca6cbf17d0415a7c16fbfc04ca547fc9a9a1bf8da8c00"} Apr 16 19:38:04.218214 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:04.217816 2568 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" Apr 16 19:38:04.250004 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:04.249945 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" podStartSLOduration=1.838902783 podStartE2EDuration="4.249923895s" podCreationTimestamp="2026-04-16 19:38:00 +0000 UTC" firstStartedPulling="2026-04-16 19:38:01.0405744 +0000 UTC m=+463.964270977" lastFinishedPulling="2026-04-16 19:38:03.451595509 +0000 UTC m=+466.375292089" observedRunningTime="2026-04-16 19:38:04.248698798 +0000 UTC m=+467.172395390" watchObservedRunningTime="2026-04-16 19:38:04.249923895 +0000 UTC m=+467.173620496" Apr 16 19:38:05.683658 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.683620 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv"] Apr 16 19:38:05.686823 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.686799 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:05.689316 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.689297 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-v5mgg\"" Apr 16 19:38:05.695870 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.695844 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv"] Apr 16 19:38:05.859516 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.859482 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcg5m\" (UniqueName: \"kubernetes.io/projected/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-kube-api-access-wcg5m\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" (UID: \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:05.859706 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.859614 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" (UID: \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:05.960819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.960696 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" (UID: \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:05.960819 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.960760 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcg5m\" (UniqueName: \"kubernetes.io/projected/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-kube-api-access-wcg5m\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" (UID: \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:05.961076 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.961055 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" (UID: \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:05.969323 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.969294 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcg5m\" (UniqueName: \"kubernetes.io/projected/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-kube-api-access-wcg5m\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" (UID: \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:05.997306 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:05.997264 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:06.148227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:06.148181 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv"] Apr 16 19:38:06.153489 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:38:06.153455 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b37f065_06d5_4055_b6d7_c0a8dc6b14e2.slice/crio-94375bcef14decff991039b1e6cfa90aeda64cd4d0ff80751f73e0d54c3366b0 WatchSource:0}: Error finding container 94375bcef14decff991039b1e6cfa90aeda64cd4d0ff80751f73e0d54c3366b0: Status 404 returned error can't find the container with id 94375bcef14decff991039b1e6cfa90aeda64cd4d0ff80751f73e0d54c3366b0 Apr 16 19:38:06.226099 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:06.226010 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" event={"ID":"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2","Type":"ContainerStarted","Data":"94375bcef14decff991039b1e6cfa90aeda64cd4d0ff80751f73e0d54c3366b0"} Apr 16 19:38:11.206569 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:11.206536 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz" Apr 16 19:38:11.246329 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:11.246290 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" event={"ID":"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2","Type":"ContainerStarted","Data":"676d42f12a033a1910dc9c0445bb0b748c42f475ae18624421bf41e673cb3e92"} Apr 16 19:38:11.246512 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:11.246406 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:11.272967 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:11.272900 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" podStartSLOduration=1.3363279559999999 podStartE2EDuration="6.272879666s" podCreationTimestamp="2026-04-16 19:38:05 +0000 UTC" firstStartedPulling="2026-04-16 19:38:06.156010728 +0000 UTC m=+469.079707306" lastFinishedPulling="2026-04-16 19:38:11.092562437 +0000 UTC m=+474.016259016" observedRunningTime="2026-04-16 19:38:11.269521306 +0000 UTC m=+474.193217907" watchObservedRunningTime="2026-04-16 19:38:11.272879666 +0000 UTC m=+474.196576265" Apr 16 19:38:15.224783 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:15.224754 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-bjtrt" Apr 16 19:38:22.252545 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:22.252512 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:23.139825 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.139796 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv"] Apr 16 19:38:23.140059 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.140011 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" containerName="manager" containerID="cri-o://676d42f12a033a1910dc9c0445bb0b748c42f475ae18624421bf41e673cb3e92" gracePeriod=2 Apr 16 19:38:23.145997 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.145970 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv"] Apr 16 19:38:23.172898 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.172857 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"] Apr 16 19:38:23.173206 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.173180 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz" podUID="8748c11e-0743-4946-aace-88de6de54143" containerName="manager" containerID="cri-o://dfc2d73c9555b10f13380f80ad1dede5f280b829a789428b30fa8016eeb63832" gracePeriod=2 Apr 16 19:38:23.174501 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.174472 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"] Apr 16 19:38:23.174851 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.174836 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" containerName="manager" Apr 16 19:38:23.174915 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.174854 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" containerName="manager" Apr 16 19:38:23.174951 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.174920 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" containerName="manager" Apr 16 19:38:23.175808 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.175769 2568 status_manager.go:895] "Failed to get status for pod" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" is forbidden: User \"system:node:ip-10-0-133-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" 
in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-198.ec2.internal' and this object" Apr 16 19:38:23.177911 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.177893 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" Apr 16 19:38:23.189222 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.189190 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"] Apr 16 19:38:23.193342 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.193299 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"] Apr 16 19:38:23.203421 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.203392 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj"] Apr 16 19:38:23.203924 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.203897 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8748c11e-0743-4946-aace-88de6de54143" containerName="manager" Apr 16 19:38:23.203924 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.203921 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8748c11e-0743-4946-aace-88de6de54143" containerName="manager" Apr 16 19:38:23.204105 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.203993 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8748c11e-0743-4946-aace-88de6de54143" containerName="manager" Apr 16 19:38:23.206952 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.206931 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj" Apr 16 19:38:23.208467 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.208443 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2a371b7-8f78-4f00-a198-36d9c8b02f33-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kzmgv\" (UID: \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" Apr 16 19:38:23.208575 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.208531 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8ggm\" (UniqueName: \"kubernetes.io/projected/c2a371b7-8f78-4f00-a198-36d9c8b02f33-kube-api-access-k8ggm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kzmgv\" (UID: \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" Apr 16 19:38:23.217293 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.217253 2568 status_manager.go:895] "Failed to get status for pod" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" is forbidden: User \"system:node:ip-10-0-133-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-198.ec2.internal' and this object" Apr 16 19:38:23.219357 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.219156 2568 status_manager.go:895] "Failed to get status for pod" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" is 
forbidden: User \"system:node:ip-10-0-133-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-198.ec2.internal' and this object" Apr 16 19:38:23.220129 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.220106 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj"] Apr 16 19:38:23.290640 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.290605 2568 generic.go:358] "Generic (PLEG): container finished" podID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" containerID="676d42f12a033a1910dc9c0445bb0b748c42f475ae18624421bf41e673cb3e92" exitCode=0 Apr 16 19:38:23.292364 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.292338 2568 generic.go:358] "Generic (PLEG): container finished" podID="8748c11e-0743-4946-aace-88de6de54143" containerID="dfc2d73c9555b10f13380f80ad1dede5f280b829a789428b30fa8016eeb63832" exitCode=0 Apr 16 19:38:23.309532 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.309487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2a371b7-8f78-4f00-a198-36d9c8b02f33-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kzmgv\" (UID: \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" Apr 16 19:38:23.309731 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.309642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8ggm\" (UniqueName: \"kubernetes.io/projected/c2a371b7-8f78-4f00-a198-36d9c8b02f33-kube-api-access-k8ggm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kzmgv\" (UID: \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" Apr 16 19:38:23.309731 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:38:23.309716 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4thjj\" (UniqueName: \"kubernetes.io/projected/a39e3df1-0e66-40fc-ba71-aca0fec7fa56-kube-api-access-4thjj\") pod \"limitador-operator-controller-manager-85c4996f8c-bchsj\" (UID: \"a39e3df1-0e66-40fc-ba71-aca0fec7fa56\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj" Apr 16 19:38:23.309912 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.309891 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2a371b7-8f78-4f00-a198-36d9c8b02f33-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kzmgv\" (UID: \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" Apr 16 19:38:23.324878 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.324842 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8ggm\" (UniqueName: \"kubernetes.io/projected/c2a371b7-8f78-4f00-a198-36d9c8b02f33-kube-api-access-k8ggm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-kzmgv\" (UID: \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" Apr 16 19:38:23.410865 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.410830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4thjj\" (UniqueName: \"kubernetes.io/projected/a39e3df1-0e66-40fc-ba71-aca0fec7fa56-kube-api-access-4thjj\") pod \"limitador-operator-controller-manager-85c4996f8c-bchsj\" (UID: \"a39e3df1-0e66-40fc-ba71-aca0fec7fa56\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj" Apr 16 19:38:23.422244 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.422222 2568 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" Apr 16 19:38:23.424611 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.424579 2568 status_manager.go:895] "Failed to get status for pod" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" is forbidden: User \"system:node:ip-10-0-133-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-198.ec2.internal' and this object" Apr 16 19:38:23.425483 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.425468 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz" Apr 16 19:38:23.427816 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.427791 2568 status_manager.go:895] "Failed to get status for pod" podUID="8748c11e-0743-4946-aace-88de6de54143" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz" err="pods \"limitador-operator-controller-manager-85c4996f8c-ctrcz\" is forbidden: User \"system:node:ip-10-0-133-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-198.ec2.internal' and this object" Apr 16 19:38:23.429886 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.429862 2568 status_manager.go:895] "Failed to get status for pod" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2lqv\" is forbidden: User \"system:node:ip-10-0-133-198.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship 
found between node 'ip-10-0-133-198.ec2.internal' and this object" Apr 16 19:38:23.441101 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.441073 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4thjj\" (UniqueName: \"kubernetes.io/projected/a39e3df1-0e66-40fc-ba71-aca0fec7fa56-kube-api-access-4thjj\") pod \"limitador-operator-controller-manager-85c4996f8c-bchsj\" (UID: \"a39e3df1-0e66-40fc-ba71-aca0fec7fa56\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj" Apr 16 19:38:23.511375 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.511341 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpl5b\" (UniqueName: \"kubernetes.io/projected/8748c11e-0743-4946-aace-88de6de54143-kube-api-access-dpl5b\") pod \"8748c11e-0743-4946-aace-88de6de54143\" (UID: \"8748c11e-0743-4946-aace-88de6de54143\") " Apr 16 19:38:23.511571 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.511395 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcg5m\" (UniqueName: \"kubernetes.io/projected/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-kube-api-access-wcg5m\") pod \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\" (UID: \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\") " Apr 16 19:38:23.511571 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.511414 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-extensions-socket-volume\") pod \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\" (UID: \"5b37f065-06d5-4055-b6d7-c0a8dc6b14e2\") " Apr 16 19:38:23.511856 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.511832 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-extensions-socket-volume" (OuterVolumeSpecName: 
"extensions-socket-volume") pod "5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" (UID: "5b37f065-06d5-4055-b6d7-c0a8dc6b14e2"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:38:23.513698 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.513627 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8748c11e-0743-4946-aace-88de6de54143-kube-api-access-dpl5b" (OuterVolumeSpecName: "kube-api-access-dpl5b") pod "8748c11e-0743-4946-aace-88de6de54143" (UID: "8748c11e-0743-4946-aace-88de6de54143"). InnerVolumeSpecName "kube-api-access-dpl5b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:38:23.513698 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.513627 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-kube-api-access-wcg5m" (OuterVolumeSpecName: "kube-api-access-wcg5m") pod "5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" (UID: "5b37f065-06d5-4055-b6d7-c0a8dc6b14e2"). InnerVolumeSpecName "kube-api-access-wcg5m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:38:23.562883 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.562837 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"
Apr 16 19:38:23.568668 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.568636 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj"
Apr 16 19:38:23.612736 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.612616 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpl5b\" (UniqueName: \"kubernetes.io/projected/8748c11e-0743-4946-aace-88de6de54143-kube-api-access-dpl5b\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:38:23.612736 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.612650 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wcg5m\" (UniqueName: \"kubernetes.io/projected/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-kube-api-access-wcg5m\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:38:23.612736 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.612664 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2-extensions-socket-volume\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:38:23.653061 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.652965 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b37f065-06d5-4055-b6d7-c0a8dc6b14e2" path="/var/lib/kubelet/pods/5b37f065-06d5-4055-b6d7-c0a8dc6b14e2/volumes"
Apr 16 19:38:23.653377 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.653358 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8748c11e-0743-4946-aace-88de6de54143" path="/var/lib/kubelet/pods/8748c11e-0743-4946-aace-88de6de54143/volumes"
Apr 16 19:38:23.705576 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.705546 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"]
Apr 16 19:38:23.708789 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:38:23.708754 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a371b7_8f78_4f00_a198_36d9c8b02f33.slice/crio-8c7ce0e51489d0a2ceebecfff10794313f03c15ca628f2d87d8441bff6059479 WatchSource:0}: Error finding container 8c7ce0e51489d0a2ceebecfff10794313f03c15ca628f2d87d8441bff6059479: Status 404 returned error can't find the container with id 8c7ce0e51489d0a2ceebecfff10794313f03c15ca628f2d87d8441bff6059479
Apr 16 19:38:23.727472 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:23.727446 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj"]
Apr 16 19:38:23.730354 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:38:23.730329 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39e3df1_0e66_40fc_ba71_aca0fec7fa56.slice/crio-a148eb882a0e8fa90b869dd7450e8477a04d8e4da957935c5e2d9b45f5bc4770 WatchSource:0}: Error finding container a148eb882a0e8fa90b869dd7450e8477a04d8e4da957935c5e2d9b45f5bc4770: Status 404 returned error can't find the container with id a148eb882a0e8fa90b869dd7450e8477a04d8e4da957935c5e2d9b45f5bc4770
Apr 16 19:38:24.296802 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.296768 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj" event={"ID":"a39e3df1-0e66-40fc-ba71-aca0fec7fa56","Type":"ContainerStarted","Data":"fcaac1af9f610944d4a767fb001cdc34e3799dd41ecf745672b9a97d92b36e02"}
Apr 16 19:38:24.296802 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.296808 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj" event={"ID":"a39e3df1-0e66-40fc-ba71-aca0fec7fa56","Type":"ContainerStarted","Data":"a148eb882a0e8fa90b869dd7450e8477a04d8e4da957935c5e2d9b45f5bc4770"}
Apr 16 19:38:24.297337 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.296890 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj"
Apr 16 19:38:24.298221 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.298197 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" event={"ID":"c2a371b7-8f78-4f00-a198-36d9c8b02f33","Type":"ContainerStarted","Data":"30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f"}
Apr 16 19:38:24.298343 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.298228 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" event={"ID":"c2a371b7-8f78-4f00-a198-36d9c8b02f33","Type":"ContainerStarted","Data":"8c7ce0e51489d0a2ceebecfff10794313f03c15ca628f2d87d8441bff6059479"}
Apr 16 19:38:24.298343 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.298265 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"
Apr 16 19:38:24.299283 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.299260 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-ctrcz"
Apr 16 19:38:24.299373 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.299285 2568 scope.go:117] "RemoveContainer" containerID="dfc2d73c9555b10f13380f80ad1dede5f280b829a789428b30fa8016eeb63832"
Apr 16 19:38:24.300619 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.300603 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2lqv"
Apr 16 19:38:24.307524 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.307505 2568 scope.go:117] "RemoveContainer" containerID="676d42f12a033a1910dc9c0445bb0b748c42f475ae18624421bf41e673cb3e92"
Apr 16 19:38:24.321424 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.321370 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj" podStartSLOduration=1.321351565 podStartE2EDuration="1.321351565s" podCreationTimestamp="2026-04-16 19:38:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:38:24.321167645 +0000 UTC m=+487.244864245" watchObservedRunningTime="2026-04-16 19:38:24.321351565 +0000 UTC m=+487.245048166"
Apr 16 19:38:24.356259 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:24.356210 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" podStartSLOduration=1.356193854 podStartE2EDuration="1.356193854s" podCreationTimestamp="2026-04-16 19:38:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:38:24.355508229 +0000 UTC m=+487.279204829" watchObservedRunningTime="2026-04-16 19:38:24.356193854 +0000 UTC m=+487.279890454"
Apr 16 19:38:35.307543 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:35.307511 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bchsj"
Apr 16 19:38:35.308023 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:35.307562 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"
Apr 16 19:38:39.969534 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:39.969499 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"]
Apr 16 19:38:39.970029 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:39.969802 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" podUID="c2a371b7-8f78-4f00-a198-36d9c8b02f33" containerName="manager" containerID="cri-o://30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f" gracePeriod=10
Apr 16 19:38:40.208740 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.208716 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"
Apr 16 19:38:40.263812 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.263726 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8ggm\" (UniqueName: \"kubernetes.io/projected/c2a371b7-8f78-4f00-a198-36d9c8b02f33-kube-api-access-k8ggm\") pod \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\" (UID: \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\") "
Apr 16 19:38:40.263812 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.263808 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2a371b7-8f78-4f00-a198-36d9c8b02f33-extensions-socket-volume\") pod \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\" (UID: \"c2a371b7-8f78-4f00-a198-36d9c8b02f33\") "
Apr 16 19:38:40.264203 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.264181 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a371b7-8f78-4f00-a198-36d9c8b02f33-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "c2a371b7-8f78-4f00-a198-36d9c8b02f33" (UID: "c2a371b7-8f78-4f00-a198-36d9c8b02f33"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:38:40.265957 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.265930 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a371b7-8f78-4f00-a198-36d9c8b02f33-kube-api-access-k8ggm" (OuterVolumeSpecName: "kube-api-access-k8ggm") pod "c2a371b7-8f78-4f00-a198-36d9c8b02f33" (UID: "c2a371b7-8f78-4f00-a198-36d9c8b02f33"). InnerVolumeSpecName "kube-api-access-k8ggm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:38:40.357281 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.357248 2568 generic.go:358] "Generic (PLEG): container finished" podID="c2a371b7-8f78-4f00-a198-36d9c8b02f33" containerID="30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f" exitCode=0
Apr 16 19:38:40.357463 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.357305 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" event={"ID":"c2a371b7-8f78-4f00-a198-36d9c8b02f33","Type":"ContainerDied","Data":"30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f"}
Apr 16 19:38:40.357463 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.357311 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"
Apr 16 19:38:40.357463 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.357331 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv" event={"ID":"c2a371b7-8f78-4f00-a198-36d9c8b02f33","Type":"ContainerDied","Data":"8c7ce0e51489d0a2ceebecfff10794313f03c15ca628f2d87d8441bff6059479"}
Apr 16 19:38:40.357463 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.357345 2568 scope.go:117] "RemoveContainer" containerID="30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f"
Apr 16 19:38:40.365034 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.364799 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8ggm\" (UniqueName: \"kubernetes.io/projected/c2a371b7-8f78-4f00-a198-36d9c8b02f33-kube-api-access-k8ggm\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:38:40.365034 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.364828 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c2a371b7-8f78-4f00-a198-36d9c8b02f33-extensions-socket-volume\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:38:40.366032 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.366011 2568 scope.go:117] "RemoveContainer" containerID="30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f"
Apr 16 19:38:40.366296 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:38:40.366278 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f\": container with ID starting with 30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f not found: ID does not exist" containerID="30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f"
Apr 16 19:38:40.366366 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.366306 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f"} err="failed to get container status \"30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f\": rpc error: code = NotFound desc = could not find container \"30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f\": container with ID starting with 30604b0942643ba287206e9998d9430bdbb973b9f17029e2ee25b4326d64719f not found: ID does not exist"
Apr 16 19:38:40.378844 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.378811 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"]
Apr 16 19:38:40.382408 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:40.382380 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-kzmgv"]
Apr 16 19:38:41.652860 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:41.652826 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a371b7-8f78-4f00-a198-36d9c8b02f33" path="/var/lib/kubelet/pods/c2a371b7-8f78-4f00-a198-36d9c8b02f33/volumes"
Apr 16 19:38:56.226442 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.226405 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"]
Apr 16 19:38:56.227006 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.226979 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2a371b7-8f78-4f00-a198-36d9c8b02f33" containerName="manager"
Apr 16 19:38:56.227099 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.227010 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a371b7-8f78-4f00-a198-36d9c8b02f33" containerName="manager"
Apr 16 19:38:56.227179 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.227156 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2a371b7-8f78-4f00-a198-36d9c8b02f33" containerName="manager"
Apr 16 19:38:56.232148 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.232123 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.234885 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.234861 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-d4k7v\""
Apr 16 19:38:56.247645 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.247620 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"]
Apr 16 19:38:56.410509 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.410466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.410509 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.410504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/81f21eff-1442-413c-b0d9-faf842fe8771-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.410831 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.410565 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppt59\" (UniqueName: \"kubernetes.io/projected/81f21eff-1442-413c-b0d9-faf842fe8771-kube-api-access-ppt59\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.410831 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.410604 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/81f21eff-1442-413c-b0d9-faf842fe8771-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.410831 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.410635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.410831 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.410666 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.410831 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.410730 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.410831 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.410746 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.410831 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.410789 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/81f21eff-1442-413c-b0d9-faf842fe8771-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.511958 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.511864 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.511958 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.511914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.511958 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.511954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/81f21eff-1442-413c-b0d9-faf842fe8771-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512225 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512005 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512225 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512029 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/81f21eff-1442-413c-b0d9-faf842fe8771-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512225 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512071 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppt59\" (UniqueName: \"kubernetes.io/projected/81f21eff-1442-413c-b0d9-faf842fe8771-kube-api-access-ppt59\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512225 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512112 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/81f21eff-1442-413c-b0d9-faf842fe8771-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512438 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512319 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512438 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512438 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512438 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512403 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512604 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512450 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512643 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512632 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/81f21eff-1442-413c-b0d9-faf842fe8771-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.512711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.512635 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.514511 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.514469 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/81f21eff-1442-413c-b0d9-faf842fe8771-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.514641 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.514554 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/81f21eff-1442-413c-b0d9-faf842fe8771-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.523384 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.523348 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/81f21eff-1442-413c-b0d9-faf842fe8771-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.523633 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.523617 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppt59\" (UniqueName: \"kubernetes.io/projected/81f21eff-1442-413c-b0d9-faf842fe8771-kube-api-access-ppt59\") pod \"maas-default-gateway-openshift-default-845c6b4b48-pzrls\" (UID: \"81f21eff-1442-413c-b0d9-faf842fe8771\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.545892 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.545871 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:56.670621 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.670586 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"]
Apr 16 19:38:56.673772 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:38:56.673742 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81f21eff_1442_413c_b0d9_faf842fe8771.slice/crio-951ab5d1262aaae773d6b238aead158b7d10feacd05bf1ba24a49757fa2e2a6c WatchSource:0}: Error finding container 951ab5d1262aaae773d6b238aead158b7d10feacd05bf1ba24a49757fa2e2a6c: Status 404 returned error can't find the container with id 951ab5d1262aaae773d6b238aead158b7d10feacd05bf1ba24a49757fa2e2a6c
Apr 16 19:38:56.675879 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.675845 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 16 19:38:56.675996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.675924 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 16 19:38:56.675996 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:56.675967 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 16 19:38:57.417093 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:57.417058 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls" event={"ID":"81f21eff-1442-413c-b0d9-faf842fe8771","Type":"ContainerStarted","Data":"0fa3905e7a02f3502ac6385871c3c431eef30eefb3065c16ae349c74654cd70a"}
Apr 16 19:38:57.417093 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:57.417093 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls" event={"ID":"81f21eff-1442-413c-b0d9-faf842fe8771","Type":"ContainerStarted","Data":"951ab5d1262aaae773d6b238aead158b7d10feacd05bf1ba24a49757fa2e2a6c"}
Apr 16 19:38:57.437882 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:57.437832 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls" podStartSLOduration=1.437816562 podStartE2EDuration="1.437816562s" podCreationTimestamp="2026-04-16 19:38:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:38:57.43548718 +0000 UTC m=+520.359183780" watchObservedRunningTime="2026-04-16 19:38:57.437816562 +0000 UTC m=+520.361513162"
Apr 16 19:38:57.546328 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:57.546296 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:57.555529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:57.555505 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:58.420322 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:58.420294 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:38:58.421277 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:38:58.421260 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-pzrls"
Apr 16 19:39:00.515710 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.515661 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4g7z"]
Apr 16 19:39:00.519407 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.519387 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z"
Apr 16 19:39:00.521735 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.521715 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 19:39:00.521823 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.521717 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5m27b\""
Apr 16 19:39:00.529371 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.529347 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4g7z"]
Apr 16 19:39:00.544113 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.544088 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r257h\" (UniqueName: \"kubernetes.io/projected/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-kube-api-access-r257h\") pod \"limitador-limitador-7d549b5b-w4g7z\" (UID: \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z"
Apr 16 19:39:00.544203 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.544139 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-config-file\") pod \"limitador-limitador-7d549b5b-w4g7z\" (UID: \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z"
Apr 16 19:39:00.614815 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.614779 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4g7z"]
Apr 16 19:39:00.644857 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.644829 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r257h\" (UniqueName: \"kubernetes.io/projected/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-kube-api-access-r257h\") pod \"limitador-limitador-7d549b5b-w4g7z\" (UID: \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z"
Apr 16 19:39:00.645027 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.644887 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-config-file\") pod \"limitador-limitador-7d549b5b-w4g7z\" (UID: \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z"
Apr 16 19:39:00.645552 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.645527 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-config-file\") pod \"limitador-limitador-7d549b5b-w4g7z\" (UID: \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z"
Apr 16 19:39:00.652538 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.652519 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r257h\" (UniqueName: \"kubernetes.io/projected/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-kube-api-access-r257h\") pod \"limitador-limitador-7d549b5b-w4g7z\" (UID: \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\") " pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z"
Apr 16 19:39:00.830429 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.830349 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z"
Apr 16 19:39:00.954786 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:00.954759 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4g7z"]
Apr 16 19:39:00.956992 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:39:00.956960 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc487095b_8d0d_4ea1_a6f1_f7975db4ce83.slice/crio-633d7af5b24971d44d5fd740bf9b450497761a347c25ed399192406349fd4703 WatchSource:0}: Error finding container 633d7af5b24971d44d5fd740bf9b450497761a347c25ed399192406349fd4703: Status 404 returned error can't find the container with id 633d7af5b24971d44d5fd740bf9b450497761a347c25ed399192406349fd4703
Apr 16 19:39:01.430785 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.430744 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" event={"ID":"c487095b-8d0d-4ea1-a6f1-f7975db4ce83","Type":"ContainerStarted","Data":"633d7af5b24971d44d5fd740bf9b450497761a347c25ed399192406349fd4703"}
Apr 16 19:39:01.488888 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.488859 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-tcmw5"]
Apr 16 19:39:01.493822 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.493799 2568 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tcmw5" Apr 16 19:39:01.496173 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.496147 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcmw5"] Apr 16 19:39:01.496292 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.496208 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-892hz\"" Apr 16 19:39:01.552479 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.552443 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7xsr\" (UniqueName: \"kubernetes.io/projected/d35d9da3-538a-4765-b2e3-5fbe42d7c743-kube-api-access-z7xsr\") pod \"authorino-7498df8756-tcmw5\" (UID: \"d35d9da3-538a-4765-b2e3-5fbe42d7c743\") " pod="kuadrant-system/authorino-7498df8756-tcmw5" Apr 16 19:39:01.655367 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.654846 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7xsr\" (UniqueName: \"kubernetes.io/projected/d35d9da3-538a-4765-b2e3-5fbe42d7c743-kube-api-access-z7xsr\") pod \"authorino-7498df8756-tcmw5\" (UID: \"d35d9da3-538a-4765-b2e3-5fbe42d7c743\") " pod="kuadrant-system/authorino-7498df8756-tcmw5" Apr 16 19:39:01.663482 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.663445 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7xsr\" (UniqueName: \"kubernetes.io/projected/d35d9da3-538a-4765-b2e3-5fbe42d7c743-kube-api-access-z7xsr\") pod \"authorino-7498df8756-tcmw5\" (UID: \"d35d9da3-538a-4765-b2e3-5fbe42d7c743\") " pod="kuadrant-system/authorino-7498df8756-tcmw5" Apr 16 19:39:01.804158 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.804065 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tcmw5" Apr 16 19:39:01.972113 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:01.972073 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcmw5"] Apr 16 19:39:01.978959 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:39:01.978243 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd35d9da3_538a_4765_b2e3_5fbe42d7c743.slice/crio-60e52948d5bd643ffc0234c120844e7054eb88589cb917405ad9e3f367a99f54 WatchSource:0}: Error finding container 60e52948d5bd643ffc0234c120844e7054eb88589cb917405ad9e3f367a99f54: Status 404 returned error can't find the container with id 60e52948d5bd643ffc0234c120844e7054eb88589cb917405ad9e3f367a99f54 Apr 16 19:39:02.437641 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:02.437601 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tcmw5" event={"ID":"d35d9da3-538a-4765-b2e3-5fbe42d7c743","Type":"ContainerStarted","Data":"60e52948d5bd643ffc0234c120844e7054eb88589cb917405ad9e3f367a99f54"} Apr 16 19:39:06.452661 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:06.452623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tcmw5" event={"ID":"d35d9da3-538a-4765-b2e3-5fbe42d7c743","Type":"ContainerStarted","Data":"6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034"} Apr 16 19:39:06.453966 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:06.453943 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" event={"ID":"c487095b-8d0d-4ea1-a6f1-f7975db4ce83","Type":"ContainerStarted","Data":"e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c"} Apr 16 19:39:06.454075 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:06.454058 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" Apr 16 19:39:06.469958 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:06.469914 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-tcmw5" podStartSLOduration=1.6900772530000001 podStartE2EDuration="5.469901538s" podCreationTimestamp="2026-04-16 19:39:01 +0000 UTC" firstStartedPulling="2026-04-16 19:39:01.981372719 +0000 UTC m=+524.905069311" lastFinishedPulling="2026-04-16 19:39:05.761197003 +0000 UTC m=+528.684893596" observedRunningTime="2026-04-16 19:39:06.468196105 +0000 UTC m=+529.391892704" watchObservedRunningTime="2026-04-16 19:39:06.469901538 +0000 UTC m=+529.393598138" Apr 16 19:39:06.487252 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:06.487198 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" podStartSLOduration=1.6915431170000002 podStartE2EDuration="6.487179703s" podCreationTimestamp="2026-04-16 19:39:00 +0000 UTC" firstStartedPulling="2026-04-16 19:39:00.958667323 +0000 UTC m=+523.882363911" lastFinishedPulling="2026-04-16 19:39:05.754303893 +0000 UTC m=+528.678000497" observedRunningTime="2026-04-16 19:39:06.48472094 +0000 UTC m=+529.408417545" watchObservedRunningTime="2026-04-16 19:39:06.487179703 +0000 UTC m=+529.410876304" Apr 16 19:39:15.114427 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:15.114385 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4g7z"] Apr 16 19:39:15.114874 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:15.114696 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" podUID="c487095b-8d0d-4ea1-a6f1-f7975db4ce83" containerName="limitador" containerID="cri-o://e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c" gracePeriod=30 Apr 16 19:39:15.115499 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:39:15.115308 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" Apr 16 19:39:15.665306 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:15.665278 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" Apr 16 19:39:15.776527 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:15.776438 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-config-file\") pod \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\" (UID: \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\") " Apr 16 19:39:15.776527 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:15.776502 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r257h\" (UniqueName: \"kubernetes.io/projected/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-kube-api-access-r257h\") pod \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\" (UID: \"c487095b-8d0d-4ea1-a6f1-f7975db4ce83\") " Apr 16 19:39:15.776885 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:15.776859 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-config-file" (OuterVolumeSpecName: "config-file") pod "c487095b-8d0d-4ea1-a6f1-f7975db4ce83" (UID: "c487095b-8d0d-4ea1-a6f1-f7975db4ce83"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:39:15.778766 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:15.778740 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-kube-api-access-r257h" (OuterVolumeSpecName: "kube-api-access-r257h") pod "c487095b-8d0d-4ea1-a6f1-f7975db4ce83" (UID: "c487095b-8d0d-4ea1-a6f1-f7975db4ce83"). 
InnerVolumeSpecName "kube-api-access-r257h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:39:15.877345 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:15.877303 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r257h\" (UniqueName: \"kubernetes.io/projected/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-kube-api-access-r257h\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:39:15.877345 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:15.877338 2568 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c487095b-8d0d-4ea1-a6f1-f7975db4ce83-config-file\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:39:16.463974 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.463935 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-pvbrp"] Apr 16 19:39:16.464615 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.464590 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c487095b-8d0d-4ea1-a6f1-f7975db4ce83" containerName="limitador" Apr 16 19:39:16.464739 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.464617 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="c487095b-8d0d-4ea1-a6f1-f7975db4ce83" containerName="limitador" Apr 16 19:39:16.464739 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.464730 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="c487095b-8d0d-4ea1-a6f1-f7975db4ce83" containerName="limitador" Apr 16 19:39:16.467975 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.467951 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:16.471100 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.471075 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 16 19:39:16.471233 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.471075 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-zhwmw\"" Apr 16 19:39:16.474237 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.474204 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-pvbrp"] Apr 16 19:39:16.490747 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.490717 2568 generic.go:358] "Generic (PLEG): container finished" podID="c487095b-8d0d-4ea1-a6f1-f7975db4ce83" containerID="e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c" exitCode=0 Apr 16 19:39:16.490890 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.490769 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" Apr 16 19:39:16.490890 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.490793 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" event={"ID":"c487095b-8d0d-4ea1-a6f1-f7975db4ce83","Type":"ContainerDied","Data":"e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c"} Apr 16 19:39:16.490890 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.490835 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-w4g7z" event={"ID":"c487095b-8d0d-4ea1-a6f1-f7975db4ce83","Type":"ContainerDied","Data":"633d7af5b24971d44d5fd740bf9b450497761a347c25ed399192406349fd4703"} Apr 16 19:39:16.490890 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.490851 2568 scope.go:117] "RemoveContainer" containerID="e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c" Apr 16 19:39:16.499181 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.499166 2568 scope.go:117] "RemoveContainer" containerID="e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c" Apr 16 19:39:16.499463 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:39:16.499441 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c\": container with ID starting with e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c not found: ID does not exist" containerID="e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c" Apr 16 19:39:16.499933 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.499471 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c"} err="failed to get container status 
\"e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c\": rpc error: code = NotFound desc = could not find container \"e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c\": container with ID starting with e18f75870772c02ede47a9c9d0c8adf8d5778488a83bd01e4af1ca9ae5c4336c not found: ID does not exist" Apr 16 19:39:16.511734 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.511710 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4g7z"] Apr 16 19:39:16.515301 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.515272 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-w4g7z"] Apr 16 19:39:16.583655 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.583614 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qrwf\" (UniqueName: \"kubernetes.io/projected/09be7125-6ad9-41eb-983a-9ce9abbbc7e8-kube-api-access-5qrwf\") pod \"postgres-868db5846d-pvbrp\" (UID: \"09be7125-6ad9-41eb-983a-9ce9abbbc7e8\") " pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:16.583869 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.583661 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09be7125-6ad9-41eb-983a-9ce9abbbc7e8-data\") pod \"postgres-868db5846d-pvbrp\" (UID: \"09be7125-6ad9-41eb-983a-9ce9abbbc7e8\") " pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:16.685227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.685175 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qrwf\" (UniqueName: \"kubernetes.io/projected/09be7125-6ad9-41eb-983a-9ce9abbbc7e8-kube-api-access-5qrwf\") pod \"postgres-868db5846d-pvbrp\" (UID: \"09be7125-6ad9-41eb-983a-9ce9abbbc7e8\") " pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:16.685412 
ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.685247 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09be7125-6ad9-41eb-983a-9ce9abbbc7e8-data\") pod \"postgres-868db5846d-pvbrp\" (UID: \"09be7125-6ad9-41eb-983a-9ce9abbbc7e8\") " pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:16.685727 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.685660 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09be7125-6ad9-41eb-983a-9ce9abbbc7e8-data\") pod \"postgres-868db5846d-pvbrp\" (UID: \"09be7125-6ad9-41eb-983a-9ce9abbbc7e8\") " pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:16.693960 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.693933 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qrwf\" (UniqueName: \"kubernetes.io/projected/09be7125-6ad9-41eb-983a-9ce9abbbc7e8-kube-api-access-5qrwf\") pod \"postgres-868db5846d-pvbrp\" (UID: \"09be7125-6ad9-41eb-983a-9ce9abbbc7e8\") " pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:16.782089 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.781993 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:16.904187 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:16.904155 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-pvbrp"] Apr 16 19:39:16.907632 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:39:16.907600 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09be7125_6ad9_41eb_983a_9ce9abbbc7e8.slice/crio-2b2ab4138f0f22b61dd7c908da586e6716018f64ba091f7e9ceac9b6f696aec4 WatchSource:0}: Error finding container 2b2ab4138f0f22b61dd7c908da586e6716018f64ba091f7e9ceac9b6f696aec4: Status 404 returned error can't find the container with id 2b2ab4138f0f22b61dd7c908da586e6716018f64ba091f7e9ceac9b6f696aec4 Apr 16 19:39:17.496252 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:17.496220 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-pvbrp" event={"ID":"09be7125-6ad9-41eb-983a-9ce9abbbc7e8","Type":"ContainerStarted","Data":"2b2ab4138f0f22b61dd7c908da586e6716018f64ba091f7e9ceac9b6f696aec4"} Apr 16 19:39:17.653089 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:17.653057 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c487095b-8d0d-4ea1-a6f1-f7975db4ce83" path="/var/lib/kubelet/pods/c487095b-8d0d-4ea1-a6f1-f7975db4ce83/volumes" Apr 16 19:39:22.055740 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:22.055715 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 16 19:39:22.519170 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:22.519132 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-pvbrp" event={"ID":"09be7125-6ad9-41eb-983a-9ce9abbbc7e8","Type":"ContainerStarted","Data":"ad9770975ab18ed3b67169361d3e05b06386a8c39105f2055b5cda54dda01806"} Apr 16 19:39:22.519393 ip-10-0-133-198 kubenswrapper[2568]: 
I0416 19:39:22.519246 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:22.536127 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:22.536067 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-pvbrp" podStartSLOduration=1.391967593 podStartE2EDuration="6.536051607s" podCreationTimestamp="2026-04-16 19:39:16 +0000 UTC" firstStartedPulling="2026-04-16 19:39:16.908912757 +0000 UTC m=+539.832609335" lastFinishedPulling="2026-04-16 19:39:22.052996768 +0000 UTC m=+544.976693349" observedRunningTime="2026-04-16 19:39:22.535620898 +0000 UTC m=+545.459317499" watchObservedRunningTime="2026-04-16 19:39:22.536051607 +0000 UTC m=+545.459748206" Apr 16 19:39:28.551846 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:28.551812 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-pvbrp" Apr 16 19:39:29.087284 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.087235 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9wkzc"] Apr 16 19:39:29.093030 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.093004 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9wkzc" Apr 16 19:39:29.098716 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.098661 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9wkzc"] Apr 16 19:39:29.201330 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.201292 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hqgm\" (UniqueName: \"kubernetes.io/projected/98bc635d-7053-4cdd-b1c9-e87361be78be-kube-api-access-7hqgm\") pod \"authorino-8b475cf9f-9wkzc\" (UID: \"98bc635d-7053-4cdd-b1c9-e87361be78be\") " pod="kuadrant-system/authorino-8b475cf9f-9wkzc" Apr 16 19:39:29.275847 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.275808 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9wkzc"] Apr 16 19:39:29.276133 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:39:29.276107 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-7hqgm], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-9wkzc" podUID="98bc635d-7053-4cdd-b1c9-e87361be78be" Apr 16 19:39:29.300585 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.300545 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56fdd757f5-v5q52"] Apr 16 19:39:29.302119 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.302087 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hqgm\" (UniqueName: \"kubernetes.io/projected/98bc635d-7053-4cdd-b1c9-e87361be78be-kube-api-access-7hqgm\") pod \"authorino-8b475cf9f-9wkzc\" (UID: \"98bc635d-7053-4cdd-b1c9-e87361be78be\") " pod="kuadrant-system/authorino-8b475cf9f-9wkzc" Apr 16 19:39:29.304435 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.304413 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-v5q52" Apr 16 19:39:29.306784 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.306764 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 19:39:29.311031 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.311007 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-v5q52"] Apr 16 19:39:29.316379 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.316354 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hqgm\" (UniqueName: \"kubernetes.io/projected/98bc635d-7053-4cdd-b1c9-e87361be78be-kube-api-access-7hqgm\") pod \"authorino-8b475cf9f-9wkzc\" (UID: \"98bc635d-7053-4cdd-b1c9-e87361be78be\") " pod="kuadrant-system/authorino-8b475cf9f-9wkzc" Apr 16 19:39:29.352347 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.352254 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-v5q52"] Apr 16 19:39:29.352570 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:39:29.352548 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-l7pbm tls-cert], unattached volumes=[], failed to process volumes=[kube-api-access-l7pbm tls-cert]: context canceled" pod="kuadrant-system/authorino-56fdd757f5-v5q52" podUID="bfdc3675-f193-4465-a2c7-b0e995d0235b" Apr 16 19:39:29.385793 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.385761 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8686888594-ggmfk"] Apr 16 19:39:29.389552 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.389529 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8686888594-ggmfk" Apr 16 19:39:29.395070 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.395041 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8686888594-ggmfk"] Apr 16 19:39:29.403227 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.403193 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bfdc3675-f193-4465-a2c7-b0e995d0235b-tls-cert\") pod \"authorino-56fdd757f5-v5q52\" (UID: \"bfdc3675-f193-4465-a2c7-b0e995d0235b\") " pod="kuadrant-system/authorino-56fdd757f5-v5q52" Apr 16 19:39:29.403381 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.403247 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7pbm\" (UniqueName: \"kubernetes.io/projected/bfdc3675-f193-4465-a2c7-b0e995d0235b-kube-api-access-l7pbm\") pod \"authorino-56fdd757f5-v5q52\" (UID: \"bfdc3675-f193-4465-a2c7-b0e995d0235b\") " pod="kuadrant-system/authorino-56fdd757f5-v5q52" Apr 16 19:39:29.504722 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.504649 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bfdc3675-f193-4465-a2c7-b0e995d0235b-tls-cert\") pod \"authorino-56fdd757f5-v5q52\" (UID: \"bfdc3675-f193-4465-a2c7-b0e995d0235b\") " pod="kuadrant-system/authorino-56fdd757f5-v5q52" Apr 16 19:39:29.504924 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.504744 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jrbr\" (UniqueName: \"kubernetes.io/projected/116077fb-55ed-444c-a319-dd5df133e639-kube-api-access-7jrbr\") pod \"authorino-8686888594-ggmfk\" (UID: \"116077fb-55ed-444c-a319-dd5df133e639\") " pod="kuadrant-system/authorino-8686888594-ggmfk" Apr 16 19:39:29.504924 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:39:29.504773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7pbm\" (UniqueName: \"kubernetes.io/projected/bfdc3675-f193-4465-a2c7-b0e995d0235b-kube-api-access-l7pbm\") pod \"authorino-56fdd757f5-v5q52\" (UID: \"bfdc3675-f193-4465-a2c7-b0e995d0235b\") " pod="kuadrant-system/authorino-56fdd757f5-v5q52"
Apr 16 19:39:29.504924 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.504810 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/116077fb-55ed-444c-a319-dd5df133e639-tls-cert\") pod \"authorino-8686888594-ggmfk\" (UID: \"116077fb-55ed-444c-a319-dd5df133e639\") " pod="kuadrant-system/authorino-8686888594-ggmfk"
Apr 16 19:39:29.507307 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.507287 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bfdc3675-f193-4465-a2c7-b0e995d0235b-tls-cert\") pod \"authorino-56fdd757f5-v5q52\" (UID: \"bfdc3675-f193-4465-a2c7-b0e995d0235b\") " pod="kuadrant-system/authorino-56fdd757f5-v5q52"
Apr 16 19:39:29.513131 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.513104 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7pbm\" (UniqueName: \"kubernetes.io/projected/bfdc3675-f193-4465-a2c7-b0e995d0235b-kube-api-access-l7pbm\") pod \"authorino-56fdd757f5-v5q52\" (UID: \"bfdc3675-f193-4465-a2c7-b0e995d0235b\") " pod="kuadrant-system/authorino-56fdd757f5-v5q52"
Apr 16 19:39:29.543772 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.543741 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-v5q52"
Apr 16 19:39:29.543968 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.543741 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9wkzc"
Apr 16 19:39:29.548987 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.548960 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-v5q52"
Apr 16 19:39:29.552425 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.552402 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9wkzc"
Apr 16 19:39:29.605792 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.605709 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hqgm\" (UniqueName: \"kubernetes.io/projected/98bc635d-7053-4cdd-b1c9-e87361be78be-kube-api-access-7hqgm\") pod \"98bc635d-7053-4cdd-b1c9-e87361be78be\" (UID: \"98bc635d-7053-4cdd-b1c9-e87361be78be\") "
Apr 16 19:39:29.605792 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.605776 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bfdc3675-f193-4465-a2c7-b0e995d0235b-tls-cert\") pod \"bfdc3675-f193-4465-a2c7-b0e995d0235b\" (UID: \"bfdc3675-f193-4465-a2c7-b0e995d0235b\") "
Apr 16 19:39:29.605954 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.605862 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7pbm\" (UniqueName: \"kubernetes.io/projected/bfdc3675-f193-4465-a2c7-b0e995d0235b-kube-api-access-l7pbm\") pod \"bfdc3675-f193-4465-a2c7-b0e995d0235b\" (UID: \"bfdc3675-f193-4465-a2c7-b0e995d0235b\") "
Apr 16 19:39:29.606048 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.606029 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jrbr\" (UniqueName: \"kubernetes.io/projected/116077fb-55ed-444c-a319-dd5df133e639-kube-api-access-7jrbr\") pod \"authorino-8686888594-ggmfk\" (UID: \"116077fb-55ed-444c-a319-dd5df133e639\") " pod="kuadrant-system/authorino-8686888594-ggmfk"
Apr 16 19:39:29.606106 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.606079 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/116077fb-55ed-444c-a319-dd5df133e639-tls-cert\") pod \"authorino-8686888594-ggmfk\" (UID: \"116077fb-55ed-444c-a319-dd5df133e639\") " pod="kuadrant-system/authorino-8686888594-ggmfk"
Apr 16 19:39:29.608152 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.608117 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdc3675-f193-4465-a2c7-b0e995d0235b-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "bfdc3675-f193-4465-a2c7-b0e995d0235b" (UID: "bfdc3675-f193-4465-a2c7-b0e995d0235b"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:39:29.608152 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.608139 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bc635d-7053-4cdd-b1c9-e87361be78be-kube-api-access-7hqgm" (OuterVolumeSpecName: "kube-api-access-7hqgm") pod "98bc635d-7053-4cdd-b1c9-e87361be78be" (UID: "98bc635d-7053-4cdd-b1c9-e87361be78be"). InnerVolumeSpecName "kube-api-access-7hqgm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:39:29.608342 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.608122 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfdc3675-f193-4465-a2c7-b0e995d0235b-kube-api-access-l7pbm" (OuterVolumeSpecName: "kube-api-access-l7pbm") pod "bfdc3675-f193-4465-a2c7-b0e995d0235b" (UID: "bfdc3675-f193-4465-a2c7-b0e995d0235b"). InnerVolumeSpecName "kube-api-access-l7pbm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:39:29.608995 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.608968 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/116077fb-55ed-444c-a319-dd5df133e639-tls-cert\") pod \"authorino-8686888594-ggmfk\" (UID: \"116077fb-55ed-444c-a319-dd5df133e639\") " pod="kuadrant-system/authorino-8686888594-ggmfk"
Apr 16 19:39:29.613906 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.613885 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jrbr\" (UniqueName: \"kubernetes.io/projected/116077fb-55ed-444c-a319-dd5df133e639-kube-api-access-7jrbr\") pod \"authorino-8686888594-ggmfk\" (UID: \"116077fb-55ed-444c-a319-dd5df133e639\") " pod="kuadrant-system/authorino-8686888594-ggmfk"
Apr 16 19:39:29.700911 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.700874 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8686888594-ggmfk"
Apr 16 19:39:29.707027 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.707000 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7hqgm\" (UniqueName: \"kubernetes.io/projected/98bc635d-7053-4cdd-b1c9-e87361be78be-kube-api-access-7hqgm\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:39:29.707027 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.707028 2568 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/bfdc3675-f193-4465-a2c7-b0e995d0235b-tls-cert\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:39:29.707171 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.707040 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7pbm\" (UniqueName: \"kubernetes.io/projected/bfdc3675-f193-4465-a2c7-b0e995d0235b-kube-api-access-l7pbm\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:39:29.831311 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:29.831284 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8686888594-ggmfk"]
Apr 16 19:39:29.834023 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:39:29.833982 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod116077fb_55ed_444c_a319_dd5df133e639.slice/crio-dcd253de40ad744c6ac20bd92b037b82770034454c1e581808fac02f0183cd03 WatchSource:0}: Error finding container dcd253de40ad744c6ac20bd92b037b82770034454c1e581808fac02f0183cd03: Status 404 returned error can't find the container with id dcd253de40ad744c6ac20bd92b037b82770034454c1e581808fac02f0183cd03
Apr 16 19:39:30.548883 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.548794 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-9wkzc"
Apr 16 19:39:30.548883 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.548818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8686888594-ggmfk" event={"ID":"116077fb-55ed-444c-a319-dd5df133e639","Type":"ContainerStarted","Data":"c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436"}
Apr 16 19:39:30.548883 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.548852 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8686888594-ggmfk" event={"ID":"116077fb-55ed-444c-a319-dd5df133e639","Type":"ContainerStarted","Data":"dcd253de40ad744c6ac20bd92b037b82770034454c1e581808fac02f0183cd03"}
Apr 16 19:39:30.549142 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.549086 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-v5q52"
Apr 16 19:39:30.567901 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.567844 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8686888594-ggmfk" podStartSLOduration=1.13159864 podStartE2EDuration="1.567827169s" podCreationTimestamp="2026-04-16 19:39:29 +0000 UTC" firstStartedPulling="2026-04-16 19:39:29.835364527 +0000 UTC m=+552.759061109" lastFinishedPulling="2026-04-16 19:39:30.271593058 +0000 UTC m=+553.195289638" observedRunningTime="2026-04-16 19:39:30.566503813 +0000 UTC m=+553.490200414" watchObservedRunningTime="2026-04-16 19:39:30.567827169 +0000 UTC m=+553.491523769"
Apr 16 19:39:30.592480 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.592441 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9wkzc"]
Apr 16 19:39:30.599890 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.599841 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcmw5"]
Apr 16 19:39:30.600172 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.600136 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-tcmw5" podUID="d35d9da3-538a-4765-b2e3-5fbe42d7c743" containerName="authorino" containerID="cri-o://6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034" gracePeriod=30
Apr 16 19:39:30.602449 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.602421 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-9wkzc"]
Apr 16 19:39:30.630585 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.630552 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-v5q52"]
Apr 16 19:39:30.635327 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.635296 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-v5q52"]
Apr 16 19:39:30.844645 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.844612 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tcmw5"
Apr 16 19:39:30.918790 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.918760 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7xsr\" (UniqueName: \"kubernetes.io/projected/d35d9da3-538a-4765-b2e3-5fbe42d7c743-kube-api-access-z7xsr\") pod \"d35d9da3-538a-4765-b2e3-5fbe42d7c743\" (UID: \"d35d9da3-538a-4765-b2e3-5fbe42d7c743\") "
Apr 16 19:39:30.921065 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:30.921039 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d35d9da3-538a-4765-b2e3-5fbe42d7c743-kube-api-access-z7xsr" (OuterVolumeSpecName: "kube-api-access-z7xsr") pod "d35d9da3-538a-4765-b2e3-5fbe42d7c743" (UID: "d35d9da3-538a-4765-b2e3-5fbe42d7c743"). InnerVolumeSpecName "kube-api-access-z7xsr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:39:31.019854 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.019818 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z7xsr\" (UniqueName: \"kubernetes.io/projected/d35d9da3-538a-4765-b2e3-5fbe42d7c743-kube-api-access-z7xsr\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:39:31.395931 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.395899 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-j4w9f"]
Apr 16 19:39:31.396246 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.396234 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d35d9da3-538a-4765-b2e3-5fbe42d7c743" containerName="authorino"
Apr 16 19:39:31.396291 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.396250 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35d9da3-538a-4765-b2e3-5fbe42d7c743" containerName="authorino"
Apr 16 19:39:31.396335 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.396325 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d35d9da3-538a-4765-b2e3-5fbe42d7c743" containerName="authorino"
Apr 16 19:39:31.399464 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.399444 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f"
Apr 16 19:39:31.402489 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.402469 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-r6dgc\""
Apr 16 19:39:31.410745 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.410720 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-j4w9f"]
Apr 16 19:39:31.525484 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.525454 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6f8f\" (UniqueName: \"kubernetes.io/projected/8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591-kube-api-access-f6f8f\") pod \"maas-controller-6d4c8f55f9-j4w9f\" (UID: \"8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591\") " pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f"
Apr 16 19:39:31.543106 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.543075 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-86c779c488-hpvrl"]
Apr 16 19:39:31.546376 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.546361 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-86c779c488-hpvrl"
Apr 16 19:39:31.553265 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.553224 2568 generic.go:358] "Generic (PLEG): container finished" podID="d35d9da3-538a-4765-b2e3-5fbe42d7c743" containerID="6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034" exitCode=0
Apr 16 19:39:31.553407 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.553326 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-tcmw5"
Apr 16 19:39:31.553407 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.553368 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tcmw5" event={"ID":"d35d9da3-538a-4765-b2e3-5fbe42d7c743","Type":"ContainerDied","Data":"6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034"}
Apr 16 19:39:31.553407 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.553399 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-tcmw5" event={"ID":"d35d9da3-538a-4765-b2e3-5fbe42d7c743","Type":"ContainerDied","Data":"60e52948d5bd643ffc0234c120844e7054eb88589cb917405ad9e3f367a99f54"}
Apr 16 19:39:31.553714 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.553419 2568 scope.go:117] "RemoveContainer" containerID="6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034"
Apr 16 19:39:31.557801 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.557776 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-86c779c488-hpvrl"]
Apr 16 19:39:31.564609 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.564585 2568 scope.go:117] "RemoveContainer" containerID="6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034"
Apr 16 19:39:31.564938 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:39:31.564915 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034\": container with ID starting with 6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034 not found: ID does not exist" containerID="6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034"
Apr 16 19:39:31.565016 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.564945 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034"} err="failed to get container status \"6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034\": rpc error: code = NotFound desc = could not find container \"6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034\": container with ID starting with 6732f702a4e2f7f9868465e4284f3a794c2bd253ff8715359e138da817ab0034 not found: ID does not exist"
Apr 16 19:39:31.577604 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.577581 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcmw5"]
Apr 16 19:39:31.581068 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.581049 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-tcmw5"]
Apr 16 19:39:31.625927 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.625903 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrzw\" (UniqueName: \"kubernetes.io/projected/e7df9913-e246-4762-a06f-cba4f338cc56-kube-api-access-tsrzw\") pod \"maas-controller-86c779c488-hpvrl\" (UID: \"e7df9913-e246-4762-a06f-cba4f338cc56\") " pod="opendatahub/maas-controller-86c779c488-hpvrl"
Apr 16 19:39:31.626085 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.625951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6f8f\" (UniqueName: \"kubernetes.io/projected/8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591-kube-api-access-f6f8f\") pod \"maas-controller-6d4c8f55f9-j4w9f\" (UID: \"8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591\") " pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f"
Apr 16 19:39:31.634688 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.634650 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6f8f\" (UniqueName: \"kubernetes.io/projected/8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591-kube-api-access-f6f8f\") pod \"maas-controller-6d4c8f55f9-j4w9f\" (UID: \"8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591\") " pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f"
Apr 16 19:39:31.653166 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.653114 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98bc635d-7053-4cdd-b1c9-e87361be78be" path="/var/lib/kubelet/pods/98bc635d-7053-4cdd-b1c9-e87361be78be/volumes"
Apr 16 19:39:31.653356 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.653345 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfdc3675-f193-4465-a2c7-b0e995d0235b" path="/var/lib/kubelet/pods/bfdc3675-f193-4465-a2c7-b0e995d0235b/volumes"
Apr 16 19:39:31.653535 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.653524 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d35d9da3-538a-4765-b2e3-5fbe42d7c743" path="/var/lib/kubelet/pods/d35d9da3-538a-4765-b2e3-5fbe42d7c743/volumes"
Apr 16 19:39:31.669036 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.669013 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-j4w9f"]
Apr 16 19:39:31.669243 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.669231 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f"
Apr 16 19:39:31.694242 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.694220 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-8944c6f4-sht7r"]
Apr 16 19:39:31.699045 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.699026 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8944c6f4-sht7r"
Apr 16 19:39:31.706866 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.706829 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8944c6f4-sht7r"]
Apr 16 19:39:31.727097 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.726906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrzw\" (UniqueName: \"kubernetes.io/projected/e7df9913-e246-4762-a06f-cba4f338cc56-kube-api-access-tsrzw\") pod \"maas-controller-86c779c488-hpvrl\" (UID: \"e7df9913-e246-4762-a06f-cba4f338cc56\") " pod="opendatahub/maas-controller-86c779c488-hpvrl"
Apr 16 19:39:31.737041 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.737021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrzw\" (UniqueName: \"kubernetes.io/projected/e7df9913-e246-4762-a06f-cba4f338cc56-kube-api-access-tsrzw\") pod \"maas-controller-86c779c488-hpvrl\" (UID: \"e7df9913-e246-4762-a06f-cba4f338cc56\") " pod="opendatahub/maas-controller-86c779c488-hpvrl"
Apr 16 19:39:31.796720 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.796661 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-j4w9f"]
Apr 16 19:39:31.800643 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:39:31.800586 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f61f7e5_fd65_4d21_b2f9_aa78ec5a1591.slice/crio-983bfa8830e46e217aff632e0c04fd44a7549f7b0c8e9dd72f7c609e7c4df569 WatchSource:0}: Error finding container 983bfa8830e46e217aff632e0c04fd44a7549f7b0c8e9dd72f7c609e7c4df569: Status 404 returned error can't find the container with id 983bfa8830e46e217aff632e0c04fd44a7549f7b0c8e9dd72f7c609e7c4df569
Apr 16 19:39:31.831341 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.831303 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jm8\" (UniqueName: \"kubernetes.io/projected/1ac22216-feb3-42ec-af61-ea8025ac7410-kube-api-access-q6jm8\") pod \"maas-controller-8944c6f4-sht7r\" (UID: \"1ac22216-feb3-42ec-af61-ea8025ac7410\") " pod="opendatahub/maas-controller-8944c6f4-sht7r"
Apr 16 19:39:31.858725 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.858689 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-86c779c488-hpvrl"
Apr 16 19:39:31.935056 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.934227 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jm8\" (UniqueName: \"kubernetes.io/projected/1ac22216-feb3-42ec-af61-ea8025ac7410-kube-api-access-q6jm8\") pod \"maas-controller-8944c6f4-sht7r\" (UID: \"1ac22216-feb3-42ec-af61-ea8025ac7410\") " pod="opendatahub/maas-controller-8944c6f4-sht7r"
Apr 16 19:39:31.953690 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:31.953636 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jm8\" (UniqueName: \"kubernetes.io/projected/1ac22216-feb3-42ec-af61-ea8025ac7410-kube-api-access-q6jm8\") pod \"maas-controller-8944c6f4-sht7r\" (UID: \"1ac22216-feb3-42ec-af61-ea8025ac7410\") " pod="opendatahub/maas-controller-8944c6f4-sht7r"
Apr 16 19:39:32.013504 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:32.012976 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8944c6f4-sht7r"
Apr 16 19:39:32.036883 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:32.036857 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-86c779c488-hpvrl"]
Apr 16 19:39:32.039297 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:39:32.039264 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7df9913_e246_4762_a06f_cba4f338cc56.slice/crio-c7e1a668e2bc841809aeeeb232081f97e4c40f975b4e706ea1a98642a2ed8172 WatchSource:0}: Error finding container c7e1a668e2bc841809aeeeb232081f97e4c40f975b4e706ea1a98642a2ed8172: Status 404 returned error can't find the container with id c7e1a668e2bc841809aeeeb232081f97e4c40f975b4e706ea1a98642a2ed8172
Apr 16 19:39:32.145971 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:32.145947 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8944c6f4-sht7r"]
Apr 16 19:39:32.148205 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:39:32.148177 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ac22216_feb3_42ec_af61_ea8025ac7410.slice/crio-d09f94a2dcde3d08d070184300e57f34173d4d7a63cd2a84d2e0224d1eec32dc WatchSource:0}: Error finding container d09f94a2dcde3d08d070184300e57f34173d4d7a63cd2a84d2e0224d1eec32dc: Status 404 returned error can't find the container with id d09f94a2dcde3d08d070184300e57f34173d4d7a63cd2a84d2e0224d1eec32dc
Apr 16 19:39:32.558606 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:32.558551 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8944c6f4-sht7r" event={"ID":"1ac22216-feb3-42ec-af61-ea8025ac7410","Type":"ContainerStarted","Data":"d09f94a2dcde3d08d070184300e57f34173d4d7a63cd2a84d2e0224d1eec32dc"}
Apr 16 19:39:32.559904 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:32.559872 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86c779c488-hpvrl" event={"ID":"e7df9913-e246-4762-a06f-cba4f338cc56","Type":"ContainerStarted","Data":"c7e1a668e2bc841809aeeeb232081f97e4c40f975b4e706ea1a98642a2ed8172"}
Apr 16 19:39:32.562236 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:32.562210 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f" event={"ID":"8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591","Type":"ContainerStarted","Data":"983bfa8830e46e217aff632e0c04fd44a7549f7b0c8e9dd72f7c609e7c4df569"}
Apr 16 19:39:36.581406 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.581369 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f" event={"ID":"8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591","Type":"ContainerStarted","Data":"c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59"}
Apr 16 19:39:36.581900 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.581486 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f"
Apr 16 19:39:36.581900 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.581487 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f" podUID="8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591" containerName="manager" containerID="cri-o://c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59" gracePeriod=10
Apr 16 19:39:36.583370 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.583344 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8944c6f4-sht7r" event={"ID":"1ac22216-feb3-42ec-af61-ea8025ac7410","Type":"ContainerStarted","Data":"c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d"}
Apr 16 19:39:36.583646 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.583624 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-8944c6f4-sht7r"
Apr 16 19:39:36.585540 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.585517 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86c779c488-hpvrl" event={"ID":"e7df9913-e246-4762-a06f-cba4f338cc56","Type":"ContainerStarted","Data":"9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e"}
Apr 16 19:39:36.585702 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.585666 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-86c779c488-hpvrl"
Apr 16 19:39:36.607031 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.606987 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f" podStartSLOduration=1.604167687 podStartE2EDuration="5.60697618s" podCreationTimestamp="2026-04-16 19:39:31 +0000 UTC" firstStartedPulling="2026-04-16 19:39:31.802382408 +0000 UTC m=+554.726078989" lastFinishedPulling="2026-04-16 19:39:35.80519089 +0000 UTC m=+558.728887482" observedRunningTime="2026-04-16 19:39:36.604188823 +0000 UTC m=+559.527885422" watchObservedRunningTime="2026-04-16 19:39:36.60697618 +0000 UTC m=+559.530672780"
Apr 16 19:39:36.634544 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.634487 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-8944c6f4-sht7r" podStartSLOduration=1.970698794 podStartE2EDuration="5.634471648s" podCreationTimestamp="2026-04-16 19:39:31 +0000 UTC" firstStartedPulling="2026-04-16 19:39:32.149484145 +0000 UTC m=+555.073180723" lastFinishedPulling="2026-04-16 19:39:35.813256984 +0000 UTC m=+558.736953577" observedRunningTime="2026-04-16 19:39:36.632287696 +0000 UTC m=+559.555984286" watchObservedRunningTime="2026-04-16 19:39:36.634471648 +0000 UTC m=+559.558168247"
Apr 16 19:39:36.661872 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.661823 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-86c779c488-hpvrl" podStartSLOduration=1.897623767 podStartE2EDuration="5.661808103s" podCreationTimestamp="2026-04-16 19:39:31 +0000 UTC" firstStartedPulling="2026-04-16 19:39:32.041357353 +0000 UTC m=+554.965053932" lastFinishedPulling="2026-04-16 19:39:35.805541689 +0000 UTC m=+558.729238268" observedRunningTime="2026-04-16 19:39:36.659446755 +0000 UTC m=+559.583143354" watchObservedRunningTime="2026-04-16 19:39:36.661808103 +0000 UTC m=+559.585504703"
Apr 16 19:39:36.831912 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.831846 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f"
Apr 16 19:39:36.987838 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.987803 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6f8f\" (UniqueName: \"kubernetes.io/projected/8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591-kube-api-access-f6f8f\") pod \"8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591\" (UID: \"8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591\") "
Apr 16 19:39:36.990177 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:36.990138 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591-kube-api-access-f6f8f" (OuterVolumeSpecName: "kube-api-access-f6f8f") pod "8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591" (UID: "8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591"). InnerVolumeSpecName "kube-api-access-f6f8f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:39:37.088638 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.088604 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f6f8f\" (UniqueName: \"kubernetes.io/projected/8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591-kube-api-access-f6f8f\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:39:37.590042 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.590011 2568 generic.go:358] "Generic (PLEG): container finished" podID="8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591" containerID="c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59" exitCode=0
Apr 16 19:39:37.590461 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.590077 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f"
Apr 16 19:39:37.590461 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.590083 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f" event={"ID":"8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591","Type":"ContainerDied","Data":"c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59"}
Apr 16 19:39:37.590461 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.590194 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-j4w9f" event={"ID":"8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591","Type":"ContainerDied","Data":"983bfa8830e46e217aff632e0c04fd44a7549f7b0c8e9dd72f7c609e7c4df569"}
Apr 16 19:39:37.590461 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.590213 2568 scope.go:117] "RemoveContainer" containerID="c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59"
Apr 16 19:39:37.601005 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.600983 2568 scope.go:117] "RemoveContainer" containerID="c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59"
Apr 16 19:39:37.601262 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:39:37.601241 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59\": container with ID starting with c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59 not found: ID does not exist" containerID="c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59"
Apr 16 19:39:37.601310 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.601270 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59"} err="failed to get container status \"c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59\": rpc error: code = NotFound desc = could not find container \"c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59\": container with ID starting with c7499f6d9698399b9b6c2bfc9e75448200490c7a575e41944bceff6bf3a9cf59 not found: ID does not exist"
Apr 16 19:39:37.610545 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.610523 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-j4w9f"]
Apr 16 19:39:37.613345 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.613325 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-j4w9f"]
Apr 16 19:39:37.652885 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:37.652854 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591" path="/var/lib/kubelet/pods/8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591/volumes"
Apr 16 19:39:38.102928 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.102886 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-76df6d87b-xkjhp"]
Apr 16 19:39:38.103382 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.103364 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591" containerName="manager"
Apr 16 19:39:38.103469 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.103385 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591" containerName="manager"
Apr 16 19:39:38.103523 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.103504 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f61f7e5-fd65-4d21-b2f9-aa78ec5a1591" containerName="manager"
Apr 16 19:39:38.106418 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.106393 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-76df6d87b-xkjhp"
Apr 16 19:39:38.109027 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.109006 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-7fvpn\""
Apr 16 19:39:38.109278 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.109239 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 16 19:39:38.109406 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.109388 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 16 19:39:38.116705 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.116662 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-76df6d87b-xkjhp"]
Apr 16 19:39:38.197496 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.197459 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvspv\" (UniqueName: \"kubernetes.io/projected/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-kube-api-access-xvspv\") pod \"maas-api-76df6d87b-xkjhp\" (UID: \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\") " pod="opendatahub/maas-api-76df6d87b-xkjhp"
Apr 16 19:39:38.197711 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.197512 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-maas-api-tls\") pod \"maas-api-76df6d87b-xkjhp\" (UID: \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\") " pod="opendatahub/maas-api-76df6d87b-xkjhp"
Apr 16 19:39:38.298659 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.298612 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvspv\" (UniqueName: \"kubernetes.io/projected/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-kube-api-access-xvspv\") pod \"maas-api-76df6d87b-xkjhp\" (UID: \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\") " pod="opendatahub/maas-api-76df6d87b-xkjhp"
Apr 16 19:39:38.298884 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.298717 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-maas-api-tls\") pod \"maas-api-76df6d87b-xkjhp\" (UID: \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\") " pod="opendatahub/maas-api-76df6d87b-xkjhp"
Apr 16 19:39:38.301585 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.301552 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-maas-api-tls\") pod \"maas-api-76df6d87b-xkjhp\" (UID: \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\") " pod="opendatahub/maas-api-76df6d87b-xkjhp"
Apr 16 19:39:38.306265 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.306238 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvspv\" (UniqueName: \"kubernetes.io/projected/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-kube-api-access-xvspv\") pod \"maas-api-76df6d87b-xkjhp\" (UID: \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\") " pod="opendatahub/maas-api-76df6d87b-xkjhp"
Apr 16 19:39:38.417336 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.417237 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-76df6d87b-xkjhp" Apr 16 19:39:38.549255 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.549185 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-76df6d87b-xkjhp"] Apr 16 19:39:38.552381 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:39:38.552349 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fd73f29_3b5e_4f2f_b889_f87c767e51ec.slice/crio-a791930f36c0682a7adebc8973b948f51a7d48981c8236208bde7bd606bbe1c9 WatchSource:0}: Error finding container a791930f36c0682a7adebc8973b948f51a7d48981c8236208bde7bd606bbe1c9: Status 404 returned error can't find the container with id a791930f36c0682a7adebc8973b948f51a7d48981c8236208bde7bd606bbe1c9 Apr 16 19:39:38.595140 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:38.595107 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-76df6d87b-xkjhp" event={"ID":"3fd73f29-3b5e-4f2f-b889-f87c767e51ec","Type":"ContainerStarted","Data":"a791930f36c0682a7adebc8973b948f51a7d48981c8236208bde7bd606bbe1c9"} Apr 16 19:39:40.604452 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:40.604414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-76df6d87b-xkjhp" event={"ID":"3fd73f29-3b5e-4f2f-b889-f87c767e51ec","Type":"ContainerStarted","Data":"c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6"} Apr 16 19:39:40.604890 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:40.604636 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-76df6d87b-xkjhp" Apr 16 19:39:40.620540 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:40.620486 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-api-76df6d87b-xkjhp" podStartSLOduration=1.142257522 podStartE2EDuration="2.620470704s" podCreationTimestamp="2026-04-16 19:39:38 +0000 UTC" firstStartedPulling="2026-04-16 19:39:38.554248967 +0000 UTC m=+561.477945545" lastFinishedPulling="2026-04-16 19:39:40.032462147 +0000 UTC m=+562.956158727" observedRunningTime="2026-04-16 19:39:40.619964984 +0000 UTC m=+563.543661586" watchObservedRunningTime="2026-04-16 19:39:40.620470704 +0000 UTC m=+563.544167314" Apr 16 19:39:46.613420 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:46.613391 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-76df6d87b-xkjhp" Apr 16 19:39:47.595783 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.595746 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-8944c6f4-sht7r" Apr 16 19:39:47.596103 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.596081 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-86c779c488-hpvrl" Apr 16 19:39:47.655301 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.655272 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-86c779c488-hpvrl"] Apr 16 19:39:47.655694 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.655483 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-86c779c488-hpvrl" podUID="e7df9913-e246-4762-a06f-cba4f338cc56" containerName="manager" containerID="cri-o://9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e" gracePeriod=10 Apr 16 19:39:47.897019 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.896996 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-86c779c488-hpvrl" Apr 16 19:39:47.952167 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.952134 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-644587fcf-nqt9m"] Apr 16 19:39:47.952515 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.952502 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7df9913-e246-4762-a06f-cba4f338cc56" containerName="manager" Apr 16 19:39:47.952558 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.952517 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7df9913-e246-4762-a06f-cba4f338cc56" containerName="manager" Apr 16 19:39:47.952592 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.952584 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7df9913-e246-4762-a06f-cba4f338cc56" containerName="manager" Apr 16 19:39:47.956233 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.956213 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-644587fcf-nqt9m" Apr 16 19:39:47.960912 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:47.960883 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-644587fcf-nqt9m"] Apr 16 19:39:48.087796 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.087758 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsrzw\" (UniqueName: \"kubernetes.io/projected/e7df9913-e246-4762-a06f-cba4f338cc56-kube-api-access-tsrzw\") pod \"e7df9913-e246-4762-a06f-cba4f338cc56\" (UID: \"e7df9913-e246-4762-a06f-cba4f338cc56\") " Apr 16 19:39:48.087983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.087929 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5h26\" (UniqueName: \"kubernetes.io/projected/dfd56659-49ee-43e0-925b-02801943694e-kube-api-access-h5h26\") pod \"maas-controller-644587fcf-nqt9m\" (UID: \"dfd56659-49ee-43e0-925b-02801943694e\") " pod="opendatahub/maas-controller-644587fcf-nqt9m" Apr 16 19:39:48.090021 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.089994 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7df9913-e246-4762-a06f-cba4f338cc56-kube-api-access-tsrzw" (OuterVolumeSpecName: "kube-api-access-tsrzw") pod "e7df9913-e246-4762-a06f-cba4f338cc56" (UID: "e7df9913-e246-4762-a06f-cba4f338cc56"). InnerVolumeSpecName "kube-api-access-tsrzw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:39:48.189581 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.189490 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5h26\" (UniqueName: \"kubernetes.io/projected/dfd56659-49ee-43e0-925b-02801943694e-kube-api-access-h5h26\") pod \"maas-controller-644587fcf-nqt9m\" (UID: \"dfd56659-49ee-43e0-925b-02801943694e\") " pod="opendatahub/maas-controller-644587fcf-nqt9m" Apr 16 19:39:48.189767 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.189606 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tsrzw\" (UniqueName: \"kubernetes.io/projected/e7df9913-e246-4762-a06f-cba4f338cc56-kube-api-access-tsrzw\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:39:48.198350 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.198326 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5h26\" (UniqueName: \"kubernetes.io/projected/dfd56659-49ee-43e0-925b-02801943694e-kube-api-access-h5h26\") pod \"maas-controller-644587fcf-nqt9m\" (UID: \"dfd56659-49ee-43e0-925b-02801943694e\") " pod="opendatahub/maas-controller-644587fcf-nqt9m" Apr 16 19:39:48.268244 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.268206 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-644587fcf-nqt9m" Apr 16 19:39:48.395796 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.395760 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-644587fcf-nqt9m"] Apr 16 19:39:48.399101 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:39:48.399067 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfd56659_49ee_43e0_925b_02801943694e.slice/crio-7d19b5314a63c6193d5fa56de153bc5939cdf4df519f80e0cbb774c066be7ddb WatchSource:0}: Error finding container 7d19b5314a63c6193d5fa56de153bc5939cdf4df519f80e0cbb774c066be7ddb: Status 404 returned error can't find the container with id 7d19b5314a63c6193d5fa56de153bc5939cdf4df519f80e0cbb774c066be7ddb Apr 16 19:39:48.630934 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.630889 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-644587fcf-nqt9m" event={"ID":"dfd56659-49ee-43e0-925b-02801943694e","Type":"ContainerStarted","Data":"7d19b5314a63c6193d5fa56de153bc5939cdf4df519f80e0cbb774c066be7ddb"} Apr 16 19:39:48.632086 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.632059 2568 generic.go:358] "Generic (PLEG): container finished" podID="e7df9913-e246-4762-a06f-cba4f338cc56" containerID="9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e" exitCode=0 Apr 16 19:39:48.632212 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.632098 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86c779c488-hpvrl" event={"ID":"e7df9913-e246-4762-a06f-cba4f338cc56","Type":"ContainerDied","Data":"9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e"} Apr 16 19:39:48.632212 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.632115 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-86c779c488-hpvrl" Apr 16 19:39:48.632212 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.632127 2568 scope.go:117] "RemoveContainer" containerID="9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e" Apr 16 19:39:48.632212 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.632118 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-86c779c488-hpvrl" event={"ID":"e7df9913-e246-4762-a06f-cba4f338cc56","Type":"ContainerDied","Data":"c7e1a668e2bc841809aeeeb232081f97e4c40f975b4e706ea1a98642a2ed8172"} Apr 16 19:39:48.641513 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.641497 2568 scope.go:117] "RemoveContainer" containerID="9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e" Apr 16 19:39:48.641778 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:39:48.641761 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e\": container with ID starting with 9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e not found: ID does not exist" containerID="9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e" Apr 16 19:39:48.641851 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.641787 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e"} err="failed to get container status \"9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e\": rpc error: code = NotFound desc = could not find container \"9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e\": container with ID starting with 9cc7e989771a0533e742883aaee282c359cbba3d8dfb9deabefb034d86f9651e not found: ID does not exist" Apr 16 19:39:48.655354 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.655330 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-86c779c488-hpvrl"] Apr 16 19:39:48.659070 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:48.659048 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-86c779c488-hpvrl"] Apr 16 19:39:49.637371 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:49.637333 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-644587fcf-nqt9m" event={"ID":"dfd56659-49ee-43e0-925b-02801943694e","Type":"ContainerStarted","Data":"626683105f6e7f2279feb10cba07e0bfdff78817df7fd7b427c1b92f07d947ce"} Apr 16 19:39:49.637571 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:49.637450 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-644587fcf-nqt9m" Apr 16 19:39:49.652387 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:49.652357 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7df9913-e246-4762-a06f-cba4f338cc56" path="/var/lib/kubelet/pods/e7df9913-e246-4762-a06f-cba4f338cc56/volumes" Apr 16 19:39:49.654580 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:39:49.654538 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-644587fcf-nqt9m" podStartSLOduration=2.330611223 podStartE2EDuration="2.654526713s" podCreationTimestamp="2026-04-16 19:39:47 +0000 UTC" firstStartedPulling="2026-04-16 19:39:48.40049564 +0000 UTC m=+571.324192221" lastFinishedPulling="2026-04-16 19:39:48.724411132 +0000 UTC m=+571.648107711" observedRunningTime="2026-04-16 19:39:49.652322106 +0000 UTC m=+572.576018728" watchObservedRunningTime="2026-04-16 19:39:49.654526713 +0000 UTC m=+572.578223312" Apr 16 19:40:00.650377 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:00.650343 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-644587fcf-nqt9m" Apr 16 19:40:00.691036 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:40:00.691005 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8944c6f4-sht7r"] Apr 16 19:40:00.691237 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:00.691216 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-8944c6f4-sht7r" podUID="1ac22216-feb3-42ec-af61-ea8025ac7410" containerName="manager" containerID="cri-o://c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d" gracePeriod=10 Apr 16 19:40:00.937694 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:00.937645 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8944c6f4-sht7r" Apr 16 19:40:00.999627 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:00.999592 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6jm8\" (UniqueName: \"kubernetes.io/projected/1ac22216-feb3-42ec-af61-ea8025ac7410-kube-api-access-q6jm8\") pod \"1ac22216-feb3-42ec-af61-ea8025ac7410\" (UID: \"1ac22216-feb3-42ec-af61-ea8025ac7410\") " Apr 16 19:40:01.001812 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.001784 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac22216-feb3-42ec-af61-ea8025ac7410-kube-api-access-q6jm8" (OuterVolumeSpecName: "kube-api-access-q6jm8") pod "1ac22216-feb3-42ec-af61-ea8025ac7410" (UID: "1ac22216-feb3-42ec-af61-ea8025ac7410"). InnerVolumeSpecName "kube-api-access-q6jm8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:40:01.101067 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.101015 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q6jm8\" (UniqueName: \"kubernetes.io/projected/1ac22216-feb3-42ec-af61-ea8025ac7410-kube-api-access-q6jm8\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:40:01.679770 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.679733 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ac22216-feb3-42ec-af61-ea8025ac7410" containerID="c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d" exitCode=0 Apr 16 19:40:01.680200 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.679775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8944c6f4-sht7r" event={"ID":"1ac22216-feb3-42ec-af61-ea8025ac7410","Type":"ContainerDied","Data":"c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d"} Apr 16 19:40:01.680200 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.679796 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8944c6f4-sht7r" Apr 16 19:40:01.680200 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.679815 2568 scope.go:117] "RemoveContainer" containerID="c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d" Apr 16 19:40:01.680200 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.679803 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8944c6f4-sht7r" event={"ID":"1ac22216-feb3-42ec-af61-ea8025ac7410","Type":"ContainerDied","Data":"d09f94a2dcde3d08d070184300e57f34173d4d7a63cd2a84d2e0224d1eec32dc"} Apr 16 19:40:01.688773 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.688754 2568 scope.go:117] "RemoveContainer" containerID="c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d" Apr 16 19:40:01.689037 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:40:01.689009 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d\": container with ID starting with c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d not found: ID does not exist" containerID="c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d" Apr 16 19:40:01.689151 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.689050 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d"} err="failed to get container status \"c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d\": rpc error: code = NotFound desc = could not find container \"c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d\": container with ID starting with c68dd69de98c1ef63a2013b2c81571422ee78dda7cab8fb49c5cfcf84f39842d not found: ID does not exist" Apr 16 19:40:01.703183 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.703155 2568 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8944c6f4-sht7r"] Apr 16 19:40:01.706885 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:01.706858 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-8944c6f4-sht7r"] Apr 16 19:40:03.654324 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:03.654289 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac22216-feb3-42ec-af61-ea8025ac7410" path="/var/lib/kubelet/pods/1ac22216-feb3-42ec-af61-ea8025ac7410/volumes" Apr 16 19:40:08.721929 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.721889 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q"] Apr 16 19:40:08.722346 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.722333 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ac22216-feb3-42ec-af61-ea8025ac7410" containerName="manager" Apr 16 19:40:08.722399 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.722348 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac22216-feb3-42ec-af61-ea8025ac7410" containerName="manager" Apr 16 19:40:08.722436 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.722408 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ac22216-feb3-42ec-af61-ea8025ac7410" containerName="manager" Apr 16 19:40:08.729549 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.729516 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.733046 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.732981 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-dvjvp\"" Apr 16 19:40:08.733292 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.733271 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 19:40:08.733474 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.733450 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 19:40:08.733559 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.733208 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 16 19:40:08.735014 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.734991 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q"] Apr 16 19:40:08.773390 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.773353 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/61ea265b-b192-4b7a-b5d6-17d0c17910a9-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.773578 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.773409 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " 
pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.773578 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.773504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.773578 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.773539 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.773778 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.773582 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.773778 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.773642 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hllsl\" (UniqueName: \"kubernetes.io/projected/61ea265b-b192-4b7a-b5d6-17d0c17910a9-kube-api-access-hllsl\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.875111 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.875069 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.875111 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.875111 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.875379 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.875138 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.875379 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.875178 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hllsl\" (UniqueName: \"kubernetes.io/projected/61ea265b-b192-4b7a-b5d6-17d0c17910a9-kube-api-access-hllsl\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.875379 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.875208 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/61ea265b-b192-4b7a-b5d6-17d0c17910a9-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " 
pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.875379 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.875283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.875605 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.875551 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.875605 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.875577 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.875714 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.875655 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.877742 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.877716 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/61ea265b-b192-4b7a-b5d6-17d0c17910a9-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.877980 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.877964 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/61ea265b-b192-4b7a-b5d6-17d0c17910a9-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:08.882667 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:08.882646 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hllsl\" (UniqueName: \"kubernetes.io/projected/61ea265b-b192-4b7a-b5d6-17d0c17910a9-kube-api-access-hllsl\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q\" (UID: \"61ea265b-b192-4b7a-b5d6-17d0c17910a9\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:09.041014 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:09.040930 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:09.182663 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:09.182583 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q"] Apr 16 19:40:09.185383 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:40:09.185347 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ea265b_b192_4b7a_b5d6_17d0c17910a9.slice/crio-58ca8f0420d2d90befba09658e48242753fa19aa5693d5473651b45a2bb59e09 WatchSource:0}: Error finding container 58ca8f0420d2d90befba09658e48242753fa19aa5693d5473651b45a2bb59e09: Status 404 returned error can't find the container with id 58ca8f0420d2d90befba09658e48242753fa19aa5693d5473651b45a2bb59e09 Apr 16 19:40:09.711266 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:09.711220 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" event={"ID":"61ea265b-b192-4b7a-b5d6-17d0c17910a9","Type":"ContainerStarted","Data":"58ca8f0420d2d90befba09658e48242753fa19aa5693d5473651b45a2bb59e09"} Apr 16 19:40:15.735690 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:15.735646 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" event={"ID":"61ea265b-b192-4b7a-b5d6-17d0c17910a9","Type":"ContainerStarted","Data":"f881c4040c5d644df5674788ae66cd61d6e69a1d119e175700b840475927cb4f"} Apr 16 19:40:20.756801 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:20.756754 2568 generic.go:358] "Generic (PLEG): container finished" podID="61ea265b-b192-4b7a-b5d6-17d0c17910a9" containerID="f881c4040c5d644df5674788ae66cd61d6e69a1d119e175700b840475927cb4f" exitCode=0 Apr 16 19:40:20.757204 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:20.756814 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" event={"ID":"61ea265b-b192-4b7a-b5d6-17d0c17910a9","Type":"ContainerDied","Data":"f881c4040c5d644df5674788ae66cd61d6e69a1d119e175700b840475927cb4f"} Apr 16 19:40:24.774240 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:24.774150 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" event={"ID":"61ea265b-b192-4b7a-b5d6-17d0c17910a9","Type":"ContainerStarted","Data":"4658d1a564eb529f012b239d3ec4f4dd31b9e6a732126346c9f212ef7ab63a2f"} Apr 16 19:40:24.774652 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:24.774371 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:24.794838 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:24.794783 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" podStartSLOduration=1.562458441 podStartE2EDuration="16.794767158s" podCreationTimestamp="2026-04-16 19:40:08 +0000 UTC" firstStartedPulling="2026-04-16 19:40:09.187031859 +0000 UTC m=+592.110728437" lastFinishedPulling="2026-04-16 19:40:24.419340575 +0000 UTC m=+607.343037154" observedRunningTime="2026-04-16 19:40:24.792513333 +0000 UTC m=+607.716209933" watchObservedRunningTime="2026-04-16 19:40:24.794767158 +0000 UTC m=+607.718463757" Apr 16 19:40:29.925929 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:29.925896 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-76df6d87b-xkjhp"] Apr 16 19:40:29.926345 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:29.926144 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-76df6d87b-xkjhp" podUID="3fd73f29-3b5e-4f2f-b889-f87c767e51ec" containerName="maas-api" containerID="cri-o://c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6" gracePeriod=30 Apr 16 
19:40:30.175051 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.175026 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-76df6d87b-xkjhp" Apr 16 19:40:30.275845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.275739 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvspv\" (UniqueName: \"kubernetes.io/projected/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-kube-api-access-xvspv\") pod \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\" (UID: \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\") " Apr 16 19:40:30.275845 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.275834 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-maas-api-tls\") pod \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\" (UID: \"3fd73f29-3b5e-4f2f-b889-f87c767e51ec\") " Apr 16 19:40:30.278092 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.278062 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "3fd73f29-3b5e-4f2f-b889-f87c767e51ec" (UID: "3fd73f29-3b5e-4f2f-b889-f87c767e51ec"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:40:30.278092 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.278082 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-kube-api-access-xvspv" (OuterVolumeSpecName: "kube-api-access-xvspv") pod "3fd73f29-3b5e-4f2f-b889-f87c767e51ec" (UID: "3fd73f29-3b5e-4f2f-b889-f87c767e51ec"). InnerVolumeSpecName "kube-api-access-xvspv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:40:30.376563 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.376527 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xvspv\" (UniqueName: \"kubernetes.io/projected/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-kube-api-access-xvspv\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:40:30.376563 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.376558 2568 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/3fd73f29-3b5e-4f2f-b889-f87c767e51ec-maas-api-tls\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\"" Apr 16 19:40:30.795092 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.795059 2568 generic.go:358] "Generic (PLEG): container finished" podID="3fd73f29-3b5e-4f2f-b889-f87c767e51ec" containerID="c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6" exitCode=0 Apr 16 19:40:30.795273 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.795122 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-76df6d87b-xkjhp" Apr 16 19:40:30.795273 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.795149 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-76df6d87b-xkjhp" event={"ID":"3fd73f29-3b5e-4f2f-b889-f87c767e51ec","Type":"ContainerDied","Data":"c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6"} Apr 16 19:40:30.795273 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.795180 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-76df6d87b-xkjhp" event={"ID":"3fd73f29-3b5e-4f2f-b889-f87c767e51ec","Type":"ContainerDied","Data":"a791930f36c0682a7adebc8973b948f51a7d48981c8236208bde7bd606bbe1c9"} Apr 16 19:40:30.795273 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.795196 2568 scope.go:117] "RemoveContainer" containerID="c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6" Apr 16 19:40:30.806532 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.806490 2568 scope.go:117] "RemoveContainer" containerID="c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6" Apr 16 19:40:30.807358 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:40:30.807077 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6\": container with ID starting with c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6 not found: ID does not exist" containerID="c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6" Apr 16 19:40:30.807358 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.807114 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6"} err="failed to get container status \"c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6\": rpc error: code = NotFound desc = 
could not find container \"c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6\": container with ID starting with c1944b363e91e68ca23d8f0e55b7ea3e003dd3eedcd8d9b50142fc3f41a3c8f6 not found: ID does not exist" Apr 16 19:40:30.821146 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.821114 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-76df6d87b-xkjhp"] Apr 16 19:40:30.824233 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:30.824209 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-76df6d87b-xkjhp"] Apr 16 19:40:31.653178 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:31.653143 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd73f29-3b5e-4f2f-b889-f87c767e51ec" path="/var/lib/kubelet/pods/3fd73f29-3b5e-4f2f-b889-f87c767e51ec/volumes" Apr 16 19:40:33.912601 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:33.912558 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd"] Apr 16 19:40:33.913136 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:33.913116 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3fd73f29-3b5e-4f2f-b889-f87c767e51ec" containerName="maas-api" Apr 16 19:40:33.913275 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:33.913141 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd73f29-3b5e-4f2f-b889-f87c767e51ec" containerName="maas-api" Apr 16 19:40:33.913275 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:33.913226 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="3fd73f29-3b5e-4f2f-b889-f87c767e51ec" containerName="maas-api" Apr 16 19:40:33.918645 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:33.918620 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:33.921053 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:33.921027 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 16 19:40:33.926921 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:33.926888 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd"] Apr 16 19:40:34.108304 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.108270 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crxjt\" (UniqueName: \"kubernetes.io/projected/311819e6-5718-47ad-88f7-ffe7d6bfa34b-kube-api-access-crxjt\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.108474 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.108312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.108474 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.108422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.108474 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.108461 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.108642 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.108568 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/311819e6-5718-47ad-88f7-ffe7d6bfa34b-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.108642 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.108628 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.209881 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.209791 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/311819e6-5718-47ad-88f7-ffe7d6bfa34b-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 
16 19:40:34.209881 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.209836 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.210104 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.209885 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crxjt\" (UniqueName: \"kubernetes.io/projected/311819e6-5718-47ad-88f7-ffe7d6bfa34b-kube-api-access-crxjt\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.210104 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.209914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.210104 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.209974 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.210104 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.210007 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.210382 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.210351 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.210517 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.210383 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.210517 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.210436 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.212213 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.212195 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/311819e6-5718-47ad-88f7-ffe7d6bfa34b-dshm\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.212387 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.212371 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/311819e6-5718-47ad-88f7-ffe7d6bfa34b-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.219977 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.219958 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crxjt\" (UniqueName: \"kubernetes.io/projected/311819e6-5718-47ad-88f7-ffe7d6bfa34b-kube-api-access-crxjt\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd\" (UID: \"311819e6-5718-47ad-88f7-ffe7d6bfa34b\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.230824 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.230801 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:34.355505 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.355483 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd"] Apr 16 19:40:34.357697 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:40:34.357648 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod311819e6_5718_47ad_88f7_ffe7d6bfa34b.slice/crio-558143cb85708191777e54bae8faf6d612de4446549e87fd5d15edbe3afcd0ef WatchSource:0}: Error finding container 558143cb85708191777e54bae8faf6d612de4446549e87fd5d15edbe3afcd0ef: Status 404 returned error can't find the container with id 558143cb85708191777e54bae8faf6d612de4446549e87fd5d15edbe3afcd0ef Apr 16 19:40:34.813867 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.813820 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" event={"ID":"311819e6-5718-47ad-88f7-ffe7d6bfa34b","Type":"ContainerStarted","Data":"4a8ac9d5f0c7e1f60daa6637e0a469508049a2c60192273fa16c61346b0fdd9c"} Apr 16 19:40:34.814058 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:34.813882 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" event={"ID":"311819e6-5718-47ad-88f7-ffe7d6bfa34b","Type":"ContainerStarted","Data":"558143cb85708191777e54bae8faf6d612de4446549e87fd5d15edbe3afcd0ef"} Apr 16 19:40:35.790866 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:35.790835 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q" Apr 16 19:40:40.837269 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:40.837235 2568 generic.go:358] "Generic (PLEG): container finished" podID="311819e6-5718-47ad-88f7-ffe7d6bfa34b" 
containerID="4a8ac9d5f0c7e1f60daa6637e0a469508049a2c60192273fa16c61346b0fdd9c" exitCode=0 Apr 16 19:40:40.837783 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:40.837290 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" event={"ID":"311819e6-5718-47ad-88f7-ffe7d6bfa34b","Type":"ContainerDied","Data":"4a8ac9d5f0c7e1f60daa6637e0a469508049a2c60192273fa16c61346b0fdd9c"} Apr 16 19:40:41.842650 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:41.842617 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" event={"ID":"311819e6-5718-47ad-88f7-ffe7d6bfa34b","Type":"ContainerStarted","Data":"949b2ce90d5ba2c47452b7c65db6c41684ab6f511f475663988c45796ae1e0dd"} Apr 16 19:40:41.843114 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:41.842860 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:41.862014 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:41.861955 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" podStartSLOduration=8.631420768 podStartE2EDuration="8.861937465s" podCreationTimestamp="2026-04-16 19:40:33 +0000 UTC" firstStartedPulling="2026-04-16 19:40:40.837967665 +0000 UTC m=+623.761664243" lastFinishedPulling="2026-04-16 19:40:41.068484361 +0000 UTC m=+623.992180940" observedRunningTime="2026-04-16 19:40:41.859764775 +0000 UTC m=+624.783461375" watchObservedRunningTime="2026-04-16 19:40:41.861937465 +0000 UTC m=+624.785634066" Apr 16 19:40:52.859352 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:52.859322 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd" Apr 16 19:40:53.425118 ip-10-0-133-198 
kubenswrapper[2568]: I0416 19:40:53.425081 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"] Apr 16 19:40:53.482535 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.482493 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"] Apr 16 19:40:53.482754 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.482613 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.485731 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.485710 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 16 19:40:53.600176 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.600138 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.600443 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.600203 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzxcg\" (UniqueName: \"kubernetes.io/projected/ab0e123a-fff4-4316-91c9-6317bb4bfa45-kube-api-access-mzxcg\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.600443 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.600272 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.600443 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.600346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.600443 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.600378 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.600443 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.600409 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab0e123a-fff4-4316-91c9-6317bb4bfa45-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.701511 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.701421 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzxcg\" (UniqueName: \"kubernetes.io/projected/ab0e123a-fff4-4316-91c9-6317bb4bfa45-kube-api-access-mzxcg\") pod 
\"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.701511 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.701501 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.701762 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.701563 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.701762 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.701590 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.701762 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.701617 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab0e123a-fff4-4316-91c9-6317bb4bfa45-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" Apr 16 19:40:53.701762 
ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.701664 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:40:53.701983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.701962 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:40:53.702041 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.702012 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:40:53.702148 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.702128 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:40:53.704019 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.703999 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ab0e123a-fff4-4316-91c9-6317bb4bfa45-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:40:53.704338 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.704319 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab0e123a-fff4-4316-91c9-6317bb4bfa45-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:40:53.716527 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.716493 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzxcg\" (UniqueName: \"kubernetes.io/projected/ab0e123a-fff4-4316-91c9-6317bb4bfa45-kube-api-access-mzxcg\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq\" (UID: \"ab0e123a-fff4-4316-91c9-6317bb4bfa45\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:40:53.792638 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.792600 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:40:53.944202 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:53.944176 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"]
Apr 16 19:40:53.945880 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:40:53.945854 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab0e123a_fff4_4316_91c9_6317bb4bfa45.slice/crio-ff1bf8d70e75ca82487de85e04337e0a802db43d540fa3f9f129fef66e35ac68 WatchSource:0}: Error finding container ff1bf8d70e75ca82487de85e04337e0a802db43d540fa3f9f129fef66e35ac68: Status 404 returned error can't find the container with id ff1bf8d70e75ca82487de85e04337e0a802db43d540fa3f9f129fef66e35ac68
Apr 16 19:40:54.897882 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:54.897837 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" event={"ID":"ab0e123a-fff4-4316-91c9-6317bb4bfa45","Type":"ContainerStarted","Data":"cbdace1ca122dcf67671824c7c556f72cefe21421812276b5fceea2a9c4903e5"}
Apr 16 19:40:54.898057 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:54.897892 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" event={"ID":"ab0e123a-fff4-4316-91c9-6317bb4bfa45","Type":"ContainerStarted","Data":"ff1bf8d70e75ca82487de85e04337e0a802db43d540fa3f9f129fef66e35ac68"}
Apr 16 19:40:55.213376 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.213301 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"]
Apr 16 19:40:55.217173 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.217151 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.219794 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.219770 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 16 19:40:55.226111 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.226091 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"]
Apr 16 19:40:55.318964 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.318929 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.318964 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.318971 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9469c878-a720-42f1-8a8e-8bc338793a8b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.319170 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.319013 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.319170 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.319034 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.319170 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.319090 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.319170 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.319104 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5v8x\" (UniqueName: \"kubernetes.io/projected/9469c878-a720-42f1-8a8e-8bc338793a8b-kube-api-access-q5v8x\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.419576 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.419547 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.419576 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.419580 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5v8x\" (UniqueName: \"kubernetes.io/projected/9469c878-a720-42f1-8a8e-8bc338793a8b-kube-api-access-q5v8x\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.419810 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.419635 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.419810 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.419666 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9469c878-a720-42f1-8a8e-8bc338793a8b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.419810 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.419766 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.420019 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.419997 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.420153 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.420130 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.420288 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.420267 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.420496 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.420376 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.422141 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.422109 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9469c878-a720-42f1-8a8e-8bc338793a8b-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.422311 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.422297 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9469c878-a720-42f1-8a8e-8bc338793a8b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.427317 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.427295 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5v8x\" (UniqueName: \"kubernetes.io/projected/9469c878-a720-42f1-8a8e-8bc338793a8b-kube-api-access-q5v8x\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt\" (UID: \"9469c878-a720-42f1-8a8e-8bc338793a8b\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.530478 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.530384 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:40:55.685778 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.685749 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"]
Apr 16 19:40:55.688153 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:40:55.688111 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9469c878_a720_42f1_8a8e_8bc338793a8b.slice/crio-a7852458e3916dc344f0f42539476a1bb2f5a7d72c3b63affced50d325fd5c26 WatchSource:0}: Error finding container a7852458e3916dc344f0f42539476a1bb2f5a7d72c3b63affced50d325fd5c26: Status 404 returned error can't find the container with id a7852458e3916dc344f0f42539476a1bb2f5a7d72c3b63affced50d325fd5c26
Apr 16 19:40:55.903343 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.903305 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt" event={"ID":"9469c878-a720-42f1-8a8e-8bc338793a8b","Type":"ContainerStarted","Data":"06d828ff23e3d3a6ffea611a3b4a41c773a5b28ca1e0233bc8d32826e73b067c"}
Apr 16 19:40:55.903343 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:40:55.903349 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt" event={"ID":"9469c878-a720-42f1-8a8e-8bc338793a8b","Type":"ContainerStarted","Data":"a7852458e3916dc344f0f42539476a1bb2f5a7d72c3b63affced50d325fd5c26"}
Apr 16 19:41:00.207872 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.207831 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"]
Apr 16 19:41:00.212735 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.212704 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.215248 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.215225 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 16 19:41:00.221743 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.221710 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"]
Apr 16 19:41:00.376480 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.376437 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.376697 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.376490 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jpvn\" (UniqueName: \"kubernetes.io/projected/ffa30b55-1c14-4753-9021-08156e1e29f7-kube-api-access-8jpvn\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.376697 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.376523 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.376697 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.376554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.376852 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.376699 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.376852 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.376762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ffa30b55-1c14-4753-9021-08156e1e29f7-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.477437 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.477335 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.477437 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.477401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.477652 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.477449 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ffa30b55-1c14-4753-9021-08156e1e29f7-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.477652 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.477548 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.477652 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.477573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jpvn\" (UniqueName: \"kubernetes.io/projected/ffa30b55-1c14-4753-9021-08156e1e29f7-kube-api-access-8jpvn\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.477652 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.477609 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.477918 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.477893 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.477983 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.477945 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.478145 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.478075 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.479811 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.479783 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ffa30b55-1c14-4753-9021-08156e1e29f7-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.480290 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.480270 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ffa30b55-1c14-4753-9021-08156e1e29f7-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.486070 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.486046 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jpvn\" (UniqueName: \"kubernetes.io/projected/ffa30b55-1c14-4753-9021-08156e1e29f7-kube-api-access-8jpvn\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-rpxtw\" (UID: \"ffa30b55-1c14-4753-9021-08156e1e29f7\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.528083 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.528033 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:00.669837 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.669813 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"]
Apr 16 19:41:00.671290 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:41:00.671255 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa30b55_1c14_4753_9021_08156e1e29f7.slice/crio-8dbc811a5344a1df1ccec0ce98a9f0e1e5f082bc28b4d58695aace449910f2c8 WatchSource:0}: Error finding container 8dbc811a5344a1df1ccec0ce98a9f0e1e5f082bc28b4d58695aace449910f2c8: Status 404 returned error can't find the container with id 8dbc811a5344a1df1ccec0ce98a9f0e1e5f082bc28b4d58695aace449910f2c8
Apr 16 19:41:00.923869 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.923830 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw" event={"ID":"ffa30b55-1c14-4753-9021-08156e1e29f7","Type":"ContainerStarted","Data":"eb3f8a8174551c8d7494bd66e076d95e3414b5b77072aed8c3e519ae68287fc8"}
Apr 16 19:41:00.923869 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:00.923873 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw" event={"ID":"ffa30b55-1c14-4753-9021-08156e1e29f7","Type":"ContainerStarted","Data":"8dbc811a5344a1df1ccec0ce98a9f0e1e5f082bc28b4d58695aace449910f2c8"}
Apr 16 19:41:01.929289 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:01.929253 2568 generic.go:358] "Generic (PLEG): container finished" podID="9469c878-a720-42f1-8a8e-8bc338793a8b" containerID="06d828ff23e3d3a6ffea611a3b4a41c773a5b28ca1e0233bc8d32826e73b067c" exitCode=0
Apr 16 19:41:01.929715 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:01.929332 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt" event={"ID":"9469c878-a720-42f1-8a8e-8bc338793a8b","Type":"ContainerDied","Data":"06d828ff23e3d3a6ffea611a3b4a41c773a5b28ca1e0233bc8d32826e73b067c"}
Apr 16 19:41:02.934253 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:02.934213 2568 generic.go:358] "Generic (PLEG): container finished" podID="ab0e123a-fff4-4316-91c9-6317bb4bfa45" containerID="cbdace1ca122dcf67671824c7c556f72cefe21421812276b5fceea2a9c4903e5" exitCode=0
Apr 16 19:41:02.934708 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:02.934281 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" event={"ID":"ab0e123a-fff4-4316-91c9-6317bb4bfa45","Type":"ContainerDied","Data":"cbdace1ca122dcf67671824c7c556f72cefe21421812276b5fceea2a9c4903e5"}
Apr 16 19:41:02.936132 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:02.936100 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt" event={"ID":"9469c878-a720-42f1-8a8e-8bc338793a8b","Type":"ContainerStarted","Data":"36218153c3d347edbc33ce252744559cd950dab8e0f69d2996df01c0e25f41a8"}
Apr 16 19:41:02.936379 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:02.936312 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:41:02.968917 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:02.968869 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt" podStartSLOduration=7.745046393 podStartE2EDuration="7.968852957s" podCreationTimestamp="2026-04-16 19:40:55 +0000 UTC" firstStartedPulling="2026-04-16 19:41:01.930251559 +0000 UTC m=+644.853948136" lastFinishedPulling="2026-04-16 19:41:02.154058118 +0000 UTC m=+645.077754700" observedRunningTime="2026-04-16 19:41:02.967121918 +0000 UTC m=+645.890818518" watchObservedRunningTime="2026-04-16 19:41:02.968852957 +0000 UTC m=+645.892549560"
Apr 16 19:41:03.942184 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:03.942140 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" event={"ID":"ab0e123a-fff4-4316-91c9-6317bb4bfa45","Type":"ContainerStarted","Data":"ec9d123c1f84e47a7da78b5f250b7803f49f7ce53dd5875ddeb61a06567b7ee1"}
Apr 16 19:41:03.942661 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:03.942588 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:41:03.961941 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:03.961894 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq" podStartSLOduration=10.754860103 podStartE2EDuration="10.961880967s" podCreationTimestamp="2026-04-16 19:40:53 +0000 UTC" firstStartedPulling="2026-04-16 19:41:02.935062474 +0000 UTC m=+645.858759053" lastFinishedPulling="2026-04-16 19:41:03.142083338 +0000 UTC m=+646.065779917" observedRunningTime="2026-04-16 19:41:03.959062915 +0000 UTC m=+646.882759515" watchObservedRunningTime="2026-04-16 19:41:03.961880967 +0000 UTC m=+646.885577610"
Apr 16 19:41:06.953447 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:06.953412 2568 generic.go:358] "Generic (PLEG): container finished" podID="ffa30b55-1c14-4753-9021-08156e1e29f7" containerID="eb3f8a8174551c8d7494bd66e076d95e3414b5b77072aed8c3e519ae68287fc8" exitCode=0
Apr 16 19:41:06.953447 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:06.953451 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw" event={"ID":"ffa30b55-1c14-4753-9021-08156e1e29f7","Type":"ContainerDied","Data":"eb3f8a8174551c8d7494bd66e076d95e3414b5b77072aed8c3e519ae68287fc8"}
Apr 16 19:41:06.954082 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:06.954066 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:41:07.958108 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:07.958073 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw" event={"ID":"ffa30b55-1c14-4753-9021-08156e1e29f7","Type":"ContainerStarted","Data":"8312d90665c033afaed68fe9c1a9a227a24507d3ce2903abf81a70f19a033433"}
Apr 16 19:41:07.958599 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:07.958558 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:07.979024 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:07.978972 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw" podStartSLOduration=7.791887371 podStartE2EDuration="7.978959956s" podCreationTimestamp="2026-04-16 19:41:00 +0000 UTC" firstStartedPulling="2026-04-16 19:41:06.954181219 +0000 UTC m=+649.877877798" lastFinishedPulling="2026-04-16 19:41:07.141253806 +0000 UTC m=+650.064950383" observedRunningTime="2026-04-16 19:41:07.977061777 +0000 UTC m=+650.900758377" watchObservedRunningTime="2026-04-16 19:41:07.978959956 +0000 UTC m=+650.902656555"
Apr 16 19:41:13.955134 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:13.955104 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt"
Apr 16 19:41:14.959721 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:14.959657 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq"
Apr 16 19:41:18.975628 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:18.975595 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-rpxtw"
Apr 16 19:41:49.177093 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:49.177056 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8686888594-ggmfk"]
Apr 16 19:41:49.177758 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:49.177270 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8686888594-ggmfk" podUID="116077fb-55ed-444c-a319-dd5df133e639" containerName="authorino" containerID="cri-o://c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436" gracePeriod=30
Apr 16 19:41:49.433613 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:49.433545 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8686888594-ggmfk"
Apr 16 19:41:49.558965 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:49.558929 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jrbr\" (UniqueName: \"kubernetes.io/projected/116077fb-55ed-444c-a319-dd5df133e639-kube-api-access-7jrbr\") pod \"116077fb-55ed-444c-a319-dd5df133e639\" (UID: \"116077fb-55ed-444c-a319-dd5df133e639\") "
Apr 16 19:41:49.559158 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:49.559030 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/116077fb-55ed-444c-a319-dd5df133e639-tls-cert\") pod \"116077fb-55ed-444c-a319-dd5df133e639\" (UID: \"116077fb-55ed-444c-a319-dd5df133e639\") "
Apr 16 19:41:49.561600 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:49.561570 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/116077fb-55ed-444c-a319-dd5df133e639-kube-api-access-7jrbr" (OuterVolumeSpecName: "kube-api-access-7jrbr") pod "116077fb-55ed-444c-a319-dd5df133e639" (UID: "116077fb-55ed-444c-a319-dd5df133e639"). InnerVolumeSpecName "kube-api-access-7jrbr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:41:49.570309 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:49.570278 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/116077fb-55ed-444c-a319-dd5df133e639-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "116077fb-55ed-444c-a319-dd5df133e639" (UID: "116077fb-55ed-444c-a319-dd5df133e639"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:41:49.660533 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:49.660504 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7jrbr\" (UniqueName: \"kubernetes.io/projected/116077fb-55ed-444c-a319-dd5df133e639-kube-api-access-7jrbr\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:41:49.660695 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:49.660539 2568 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/116077fb-55ed-444c-a319-dd5df133e639-tls-cert\") on node \"ip-10-0-133-198.ec2.internal\" DevicePath \"\""
Apr 16 19:41:50.113792 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:50.113757 2568 generic.go:358] "Generic (PLEG): container finished" podID="116077fb-55ed-444c-a319-dd5df133e639" containerID="c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436" exitCode=0
Apr 16 19:41:50.114000 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:50.113821 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8686888594-ggmfk"
Apr 16 19:41:50.114000 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:50.113847 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8686888594-ggmfk" event={"ID":"116077fb-55ed-444c-a319-dd5df133e639","Type":"ContainerDied","Data":"c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436"}
Apr 16 19:41:50.114000 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:50.113896 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8686888594-ggmfk" event={"ID":"116077fb-55ed-444c-a319-dd5df133e639","Type":"ContainerDied","Data":"dcd253de40ad744c6ac20bd92b037b82770034454c1e581808fac02f0183cd03"}
Apr 16 19:41:50.114000 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:50.113915 2568 scope.go:117] "RemoveContainer" containerID="c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436"
Apr 16 19:41:50.122657 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:50.122629 2568 scope.go:117] "RemoveContainer" containerID="c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436"
Apr 16 19:41:50.122966 ip-10-0-133-198 kubenswrapper[2568]: E0416 19:41:50.122947 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436\": container with ID starting with c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436 not found: ID does not exist" containerID="c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436"
Apr 16 19:41:50.123018 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:50.122976 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436"} err="failed to get container status \"c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436\": rpc error: code = NotFound desc = could not find container \"c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436\": container with ID starting with c3be310a18e6251ec29178f8ab1ee28d8bc98a35ae24591a67991afb8ed3f436 not found: ID does not exist"
Apr 16 19:41:50.129328 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:50.129304 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8686888594-ggmfk"]
Apr 16 19:41:50.133050 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:50.133025 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8686888594-ggmfk"]
Apr 16 19:41:51.653798 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:41:51.653761 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="116077fb-55ed-444c-a319-dd5df133e639" path="/var/lib/kubelet/pods/116077fb-55ed-444c-a319-dd5df133e639/volumes"
Apr 16 19:42:41.765789 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:41.765758 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-644587fcf-nqt9m_dfd56659-49ee-43e0-925b-02801943694e/manager/0.log"
Apr 16 19:42:42.241307 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:42.241273 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57586b9555-v675c_7458448c-7f4a-4d6e-8288-a5cc4c0993c2/manager/0.log"
Apr 16 19:42:42.355529 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:42.355503 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-pvbrp_09be7125-6ad9-41eb-983a-9ce9abbbc7e8/postgres/0.log"
Apr 16 19:42:43.857993 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:43.857963 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-bjtrt_c75a6e35-6996-48d2-a141-e778996fc546/manager/0.log"
Apr 16 19:42:44.103622 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:44.103582 2568 log.go:25] "Finished
parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-mrmsv_8a729e29-27a0-4a24-8ec0-ff538e303f7e/registry-server/0.log" Apr 16 19:42:44.449653 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:44.449619 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-bchsj_a39e3df1-0e66-40fc-ba71-aca0fec7fa56/manager/0.log" Apr 16 19:42:44.791386 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:44.791304 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f4ffn5_252f8668-c013-4a42-9977-123205eefcdb/istio-proxy/0.log" Apr 16 19:42:45.262545 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:45.262487 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-pzrls_81f21eff-1442-413c-b0d9-faf842fe8771/istio-proxy/0.log" Apr 16 19:42:45.715909 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:45.715859 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q_61ea265b-b192-4b7a-b5d6-17d0c17910a9/storage-initializer/0.log" Apr 16 19:42:45.723128 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:45.723103 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-rnh4q_61ea265b-b192-4b7a-b5d6-17d0c17910a9/main/0.log" Apr 16 19:42:45.831854 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:45.831824 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-rpxtw_ffa30b55-1c14-4753-9021-08156e1e29f7/storage-initializer/0.log" Apr 16 19:42:45.839182 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:45.839158 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-rpxtw_ffa30b55-1c14-4753-9021-08156e1e29f7/main/0.log" Apr 16 
19:42:45.952110 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:45.952079 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd_311819e6-5718-47ad-88f7-ffe7d6bfa34b/main/0.log" Apr 16 19:42:45.958913 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:45.958889 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc4dfwd_311819e6-5718-47ad-88f7-ffe7d6bfa34b/storage-initializer/0.log" Apr 16 19:42:46.069518 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:46.069438 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt_9469c878-a720-42f1-8a8e-8bc338793a8b/storage-initializer/0.log" Apr 16 19:42:46.076375 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:46.076348 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-w46pt_9469c878-a720-42f1-8a8e-8bc338793a8b/main/0.log" Apr 16 19:42:46.184115 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:46.184079 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq_ab0e123a-fff4-4316-91c9-6317bb4bfa45/storage-initializer/0.log" Apr 16 19:42:46.191274 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:46.191245 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-cjvhq_ab0e123a-fff4-4316-91c9-6317bb4bfa45/main/0.log" Apr 16 19:42:53.450038 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:53.450003 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-krqn6_252a776b-215a-41af-9c37-185dedf959ea/global-pull-secret-syncer/0.log" Apr 16 19:42:53.535152 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:53.535123 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-dxnh8_e224c06c-213a-487e-844c-5d72da62ac07/konnectivity-agent/0.log" Apr 16 19:42:53.628583 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:53.628551 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-198.ec2.internal_4cf8424ac7d796a78f96e62791daed1d/haproxy/0.log" Apr 16 19:42:57.703693 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:57.703650 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-bjtrt_c75a6e35-6996-48d2-a141-e778996fc546/manager/0.log" Apr 16 19:42:57.762330 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:57.762285 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-mrmsv_8a729e29-27a0-4a24-8ec0-ff538e303f7e/registry-server/0.log" Apr 16 19:42:57.856336 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:57.856304 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-bchsj_a39e3df1-0e66-40fc-ba71-aca0fec7fa56/manager/0.log" Apr 16 19:42:59.386314 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.386288 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-dpb5r_8bc472a4-e44f-4c82-a32b-3cda6b957d95/cluster-monitoring-operator/0.log" Apr 16 19:42:59.415850 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.415819 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7xz85_c0525fb0-30a0-4fd3-a006-2b9b67460566/kube-state-metrics/0.log" Apr 16 19:42:59.469602 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.469521 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7xz85_c0525fb0-30a0-4fd3-a006-2b9b67460566/kube-rbac-proxy-main/0.log" Apr 16 
19:42:59.521981 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.521947 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-7xz85_c0525fb0-30a0-4fd3-a006-2b9b67460566/kube-rbac-proxy-self/0.log" Apr 16 19:42:59.579331 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.579301 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-422kv_30844433-d473-4019-98ea-1e208e6aea91/monitoring-plugin/0.log" Apr 16 19:42:59.692997 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.692966 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9b9rf_3e3d17f2-5079-43eb-a07c-110bfa423d12/node-exporter/0.log" Apr 16 19:42:59.712116 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.712088 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9b9rf_3e3d17f2-5079-43eb-a07c-110bfa423d12/kube-rbac-proxy/0.log" Apr 16 19:42:59.733689 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.733599 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9b9rf_3e3d17f2-5079-43eb-a07c-110bfa423d12/init-textfile/0.log" Apr 16 19:42:59.918993 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.918963 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_15f0261f-6878-42f5-8a58-ae42cf943b64/prometheus/0.log" Apr 16 19:42:59.940720 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.940659 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_15f0261f-6878-42f5-8a58-ae42cf943b64/config-reloader/0.log" Apr 16 19:42:59.964974 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.964942 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_15f0261f-6878-42f5-8a58-ae42cf943b64/thanos-sidecar/0.log" Apr 16 
19:42:59.987168 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:42:59.987092 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_15f0261f-6878-42f5-8a58-ae42cf943b64/kube-rbac-proxy-web/0.log" Apr 16 19:43:00.015109 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.015083 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_15f0261f-6878-42f5-8a58-ae42cf943b64/kube-rbac-proxy/0.log" Apr 16 19:43:00.035946 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.035919 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_15f0261f-6878-42f5-8a58-ae42cf943b64/kube-rbac-proxy-thanos/0.log" Apr 16 19:43:00.061362 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.061334 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_15f0261f-6878-42f5-8a58-ae42cf943b64/init-config-reloader/0.log" Apr 16 19:43:00.144157 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.144127 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-6wwx6_403c7ad0-c2e7-4007-a079-55ac5fac2efe/prometheus-operator-admission-webhook/0.log" Apr 16 19:43:00.261946 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.261868 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b9b4c777-nggfv_75ffdd93-6f95-4760-97cb-bc2ff147f109/thanos-query/0.log" Apr 16 19:43:00.283448 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.283418 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b9b4c777-nggfv_75ffdd93-6f95-4760-97cb-bc2ff147f109/kube-rbac-proxy-web/0.log" Apr 16 19:43:00.304210 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.304181 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-57b9b4c777-nggfv_75ffdd93-6f95-4760-97cb-bc2ff147f109/kube-rbac-proxy/0.log" Apr 16 19:43:00.330238 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.330213 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b9b4c777-nggfv_75ffdd93-6f95-4760-97cb-bc2ff147f109/prom-label-proxy/0.log" Apr 16 19:43:00.356702 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.356653 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b9b4c777-nggfv_75ffdd93-6f95-4760-97cb-bc2ff147f109/kube-rbac-proxy-rules/0.log" Apr 16 19:43:00.378329 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:00.378304 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b9b4c777-nggfv_75ffdd93-6f95-4760-97cb-bc2ff147f109/kube-rbac-proxy-metrics/0.log" Apr 16 19:43:01.925965 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:01.925932 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn"] Apr 16 19:43:01.926567 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:01.926543 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="116077fb-55ed-444c-a319-dd5df133e639" containerName="authorino" Apr 16 19:43:01.926660 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:01.926571 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="116077fb-55ed-444c-a319-dd5df133e639" containerName="authorino" Apr 16 19:43:01.926756 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:01.926701 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="116077fb-55ed-444c-a319-dd5df133e639" containerName="authorino" Apr 16 19:43:01.930060 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:01.930036 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:01.932825 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:01.932803 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5h99\"/\"openshift-service-ca.crt\"" Apr 16 19:43:01.932955 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:01.932846 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5h99\"/\"kube-root-ca.crt\"" Apr 16 19:43:01.933751 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:01.933734 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r5h99\"/\"default-dockercfg-5clhq\"" Apr 16 19:43:01.940864 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:01.940832 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn"] Apr 16 19:43:02.054344 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.054307 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-podres\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.054535 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.054361 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-sys\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.054535 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.054398 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-lib-modules\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.054535 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.054434 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-proc\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.054535 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.054478 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbc9x\" (UniqueName: \"kubernetes.io/projected/512b8994-d201-4eab-ba87-be93f41917c1-kube-api-access-pbc9x\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.155921 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.155881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbc9x\" (UniqueName: \"kubernetes.io/projected/512b8994-d201-4eab-ba87-be93f41917c1-kube-api-access-pbc9x\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.156125 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.155961 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-podres\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " 
pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.156125 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.155990 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-sys\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.156125 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.156020 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-lib-modules\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.156125 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.156037 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-proc\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.156125 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.156105 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-sys\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.156125 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.156125 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-proc\") pod 
\"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.156330 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.156150 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-podres\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.156330 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.156150 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/512b8994-d201-4eab-ba87-be93f41917c1-lib-modules\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.164326 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.164287 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbc9x\" (UniqueName: \"kubernetes.io/projected/512b8994-d201-4eab-ba87-be93f41917c1-kube-api-access-pbc9x\") pod \"perf-node-gather-daemonset-4ffxn\" (UID: \"512b8994-d201-4eab-ba87-be93f41917c1\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.242175 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.242072 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:02.375176 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:02.375149 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn"] Apr 16 19:43:02.377882 ip-10-0-133-198 kubenswrapper[2568]: W0416 19:43:02.377857 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod512b8994_d201_4eab_ba87_be93f41917c1.slice/crio-5447720e1d6a7795305231d27e93074953182e2fe7712ba40391fede1cfd695a WatchSource:0}: Error finding container 5447720e1d6a7795305231d27e93074953182e2fe7712ba40391fede1cfd695a: Status 404 returned error can't find the container with id 5447720e1d6a7795305231d27e93074953182e2fe7712ba40391fede1cfd695a Apr 16 19:43:03.381505 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:03.381470 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" event={"ID":"512b8994-d201-4eab-ba87-be93f41917c1","Type":"ContainerStarted","Data":"1655f4713a75946a352568640742007f6954a5b15c60904983921ef6591c0d08"} Apr 16 19:43:03.381505 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:03.381506 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" event={"ID":"512b8994-d201-4eab-ba87-be93f41917c1","Type":"ContainerStarted","Data":"5447720e1d6a7795305231d27e93074953182e2fe7712ba40391fede1cfd695a"} Apr 16 19:43:03.381976 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:03.381604 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:03.399270 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:03.399217 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" 
podStartSLOduration=2.3991970670000002 podStartE2EDuration="2.399197067s" podCreationTimestamp="2026-04-16 19:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:43:03.39684422 +0000 UTC m=+766.320540821" watchObservedRunningTime="2026-04-16 19:43:03.399197067 +0000 UTC m=+766.322893710" Apr 16 19:43:03.764403 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:03.764316 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-brrh2_d702f7c4-fdbf-4294-9aee-07546029945f/dns/0.log" Apr 16 19:43:03.784903 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:03.784877 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-brrh2_d702f7c4-fdbf-4294-9aee-07546029945f/kube-rbac-proxy/0.log" Apr 16 19:43:03.915600 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:03.915569 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ftbrq_fd63a5dc-d580-47b6-b37d-b1972fcea60a/dns-node-resolver/0.log" Apr 16 19:43:04.398384 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:04.398352 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-8698848557-44r8j_37b94f27-fe0b-4bd4-9976-8459a9f483b5/registry/0.log" Apr 16 19:43:04.463036 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:04.463002 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vsb4w_ab25b071-b28e-4ed4-8595-4ea92620f2bd/node-ca/0.log" Apr 16 19:43:05.235917 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:05.235890 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f4ffn5_252f8668-c013-4a42-9977-123205eefcdb/istio-proxy/0.log" Apr 16 19:43:05.384884 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:05.384855 2568 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-pzrls_81f21eff-1442-413c-b0d9-faf842fe8771/istio-proxy/0.log" Apr 16 19:43:05.889699 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:05.889635 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gqlxj_09f9c8cc-cfd9-462c-9967-1afa5e6543ea/serve-healthcheck-canary/0.log" Apr 16 19:43:06.480039 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:06.480012 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nb8l9_1c015403-db65-48c7-860c-61aaa90431fd/kube-rbac-proxy/0.log" Apr 16 19:43:06.498581 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:06.498558 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nb8l9_1c015403-db65-48c7-860c-61aaa90431fd/exporter/0.log" Apr 16 19:43:06.518197 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:06.518167 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-nb8l9_1c015403-db65-48c7-860c-61aaa90431fd/extractor/0.log" Apr 16 19:43:08.373538 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:08.373505 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-644587fcf-nqt9m_dfd56659-49ee-43e0-925b-02801943694e/manager/0.log" Apr 16 19:43:08.493129 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:08.493102 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57586b9555-v675c_7458448c-7f4a-4d6e-8288-a5cc4c0993c2/manager/0.log" Apr 16 19:43:08.516522 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:08.516497 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-pvbrp_09be7125-6ad9-41eb-983a-9ce9abbbc7e8/postgres/0.log" Apr 16 19:43:09.395879 ip-10-0-133-198 kubenswrapper[2568]: I0416 
19:43:09.395853 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-4ffxn" Apr 16 19:43:09.603351 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:09.603326 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5bfdb756-t47zs_6f0dab7e-ceb4-418e-98e0-f497e72ca500/manager/0.log" Apr 16 19:43:15.222471 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.222400 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gt9gg_901bc30b-5940-410f-8379-b703113afa1a/kube-multus-additional-cni-plugins/0.log" Apr 16 19:43:15.243563 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.243537 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gt9gg_901bc30b-5940-410f-8379-b703113afa1a/egress-router-binary-copy/0.log" Apr 16 19:43:15.263217 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.263187 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gt9gg_901bc30b-5940-410f-8379-b703113afa1a/cni-plugins/0.log" Apr 16 19:43:15.289331 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.289304 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gt9gg_901bc30b-5940-410f-8379-b703113afa1a/bond-cni-plugin/0.log" Apr 16 19:43:15.308624 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.308598 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gt9gg_901bc30b-5940-410f-8379-b703113afa1a/routeoverride-cni/0.log" Apr 16 19:43:15.329168 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.329138 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gt9gg_901bc30b-5940-410f-8379-b703113afa1a/whereabouts-cni-bincopy/0.log" Apr 16 19:43:15.350388 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.350358 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gt9gg_901bc30b-5940-410f-8379-b703113afa1a/whereabouts-cni/0.log" Apr 16 19:43:15.696228 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.696194 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-g2sd7_aff5494c-5205-4f24-a716-e1d25bc64f7c/kube-multus/0.log" Apr 16 19:43:15.836408 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.836382 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p9mcp_1a9f09ff-a8fe-41b7-b833-8c5091a88fb6/network-metrics-daemon/0.log" Apr 16 19:43:15.856070 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:15.856039 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p9mcp_1a9f09ff-a8fe-41b7-b833-8c5091a88fb6/kube-rbac-proxy/0.log" Apr 16 19:43:17.247199 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:17.247158 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vf8sc_a149ee97-cada-4c41-b88a-0351739b3d48/ovn-controller/0.log" Apr 16 19:43:17.272029 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:17.271987 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vf8sc_a149ee97-cada-4c41-b88a-0351739b3d48/ovn-acl-logging/0.log" Apr 16 19:43:17.297610 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:17.297571 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vf8sc_a149ee97-cada-4c41-b88a-0351739b3d48/kube-rbac-proxy-node/0.log" Apr 16 19:43:17.321487 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:17.321454 2568 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vf8sc_a149ee97-cada-4c41-b88a-0351739b3d48/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:43:17.342093 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:17.342065 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vf8sc_a149ee97-cada-4c41-b88a-0351739b3d48/northd/0.log" Apr 16 19:43:17.364101 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:17.364022 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vf8sc_a149ee97-cada-4c41-b88a-0351739b3d48/nbdb/0.log" Apr 16 19:43:17.386206 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:17.386180 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vf8sc_a149ee97-cada-4c41-b88a-0351739b3d48/sbdb/0.log" Apr 16 19:43:17.501410 ip-10-0-133-198 kubenswrapper[2568]: I0416 19:43:17.501363 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vf8sc_a149ee97-cada-4c41-b88a-0351739b3d48/ovnkube-controller/0.log"