Apr 20 23:12:50.207271 ip-10-0-134-166 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 23:12:50.652196 ip-10-0-134-166 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 23:12:50.652196 ip-10-0-134-166 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 23:12:50.652196 ip-10-0-134-166 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 23:12:50.652196 ip-10-0-134-166 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 23:12:50.652196 ip-10-0-134-166 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
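The deprecation warnings above all point at the same remediation: move these values out of kubelet command-line flags and into the file passed via --config (on this node, /etc/kubernetes/kubelet.conf per the FLAG dump later in the log). A minimal sketch of the equivalent KubeletConfiguration stanzas follows; the resource amounts and the volume plugin path are illustrative assumptions, since the log does not show this node's actual values:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces --container-runtime-endpoint (socket path taken from the FLAG dump below)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# Replaces --volume-plugin-dir (this path is illustrative, not from the log)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# Replaces --system-reserved (illustrative reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
# The --minimum-container-ttl-duration warning suggests eviction settings instead
evictionHard:
  memory.available: 100Mi
```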
Apr 20 23:12:50.653267 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.653171 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 23:12:50.655563 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655547 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 23:12:50.655563 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655563 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655567 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655572 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655575 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655578 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655581 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655585 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655589 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655593 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655595 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655598 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655601 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655603 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655606 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655609 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655611 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655614 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655617 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655619 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 23:12:50.655628 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655622 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655625 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655628 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655631 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655638 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655641 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655644 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655646 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655649 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655651 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655654 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655657 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655659 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655662 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655664 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655667 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655669 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655672 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655674 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655677 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 23:12:50.656097 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655679 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655682 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655684 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655689 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655693 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655697 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655699 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655703 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655705 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655708 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655710 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655713 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655716 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655719 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655723 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655725 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655728 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655731 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655733 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 23:12:50.656649 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655736 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655739 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655741 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655744 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655746 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655749 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655751 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655754 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655756 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655759 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655761 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655764 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655766 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655769 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655771 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655774 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655777 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655780 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655783 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655785 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 23:12:50.657136 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655787 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655790 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655792 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655795 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655797 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655800 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.655802 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656234 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656240 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656243 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656246 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656249 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656252 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656254 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656257 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656260 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656262 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656265 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656268 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656270 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 23:12:50.657617 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656273 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656276 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656278 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656283 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
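Nearly all of these warnings are kubenswrapper reporting OpenShift-specific feature gates that the embedded upstream kubelet does not recognize (and therefore ignores); only the feature_gate.go:349 and :351 lines (deprecated KMSv1, GA ServiceAccountTokenNodeBinding) describe gates the kubelet actually acts on. When scanning a journal like this for real problems, it can help to filter the :328 noise first. A small sketch, using an inline sample in place of the real journal and a throwaway file path:

```shell
#!/bin/sh
# Drop the "unrecognized feature gate" warnings so the remaining gate
# messages (deprecated and GA gates) stand out. The heredoc is a trimmed
# stand-in for the journal output above.
cat > /tmp/kubelet-sample.log <<'EOF'
W0420 23:12:50.655547 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
W0420 23:12:50.655585 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
W0420 23:12:50.655689 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
EOF
grep -v 'feature_gate.go:328] unrecognized feature gate:' /tmp/kubelet-sample.log
```

Against the live node, the same filter would be applied to `journalctl -u kubelet` output instead of the sample file.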
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656287 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656290 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656293 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656296 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656299 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656302 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656305 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656308 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656310 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656314 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656316 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656319 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656321 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656324 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656326 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 23:12:50.658123 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656330 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656333 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656336 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656339 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656341 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656344 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656346 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656349 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656351 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656354 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656357 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656359 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656362 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656364 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656366 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656369 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656371 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656374 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656376 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656379 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 23:12:50.658599 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656381 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656384 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656393 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656396 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656398 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656401 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656403 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656406 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656408 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656411 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656413 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656416 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656420 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656423 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656426 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656430 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656433 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656436 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656438 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656441 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 23:12:50.659106 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656443 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656446 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656448 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656451 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656453 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656456 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656458 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656461 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656463 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656465 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656468 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656470 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656473 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.656475 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656558 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656579 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656586 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656591 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656596 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656599 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656608 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 23:12:50.659641 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656612 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656616 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656620 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656623 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656627 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656630 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656633 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656636 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656639 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656642 2577 flags.go:64] FLAG: --cloud-config=""
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656645 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656648 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656655 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656658 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656662 2577 flags.go:64] FLAG: --config-dir=""
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656665 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656669 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656673 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656676 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656679 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656682 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656685 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656688 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656691 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656695 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 23:12:50.660175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656697 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656703 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656714 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656717 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656720 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656723 2577 flags.go:64] FLAG: --enable-server="true"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656726 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656733 2577 flags.go:64] FLAG: --event-burst="100"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656737 2577 flags.go:64] FLAG: --event-qps="50"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656740 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656743 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656746 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656750 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656754 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656757 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656760 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656763 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656766 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656769 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656772 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656775 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656778 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656780 2577 flags.go:64] FLAG: --feature-gates=""
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656784 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656787 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 23:12:50.660810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656790 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656794 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656797 2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656800 2577 flags.go:64] FLAG: --help="false"
Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656803 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-134-166.ec2.internal"
Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656806 2577 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656809 2577 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656812 2577 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656816 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656820 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656826 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656829 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656832 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656835 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656838 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656841 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656844 2577 flags.go:64] FLAG: --kube-reserved="" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656846 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656849 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656852 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656855 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 23:12:50.661456 ip-10-0-134-166 
kubenswrapper[2577]: I0420 23:12:50.656858 2577 flags.go:64] FLAG: --lock-file="" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656861 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656865 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 23:12:50.661456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656867 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656873 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656876 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656879 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656882 2577 flags.go:64] FLAG: --logging-format="text" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656885 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656888 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656890 2577 flags.go:64] FLAG: --manifest-url="" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656893 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656897 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656900 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656904 2577 flags.go:64] FLAG: --max-pods="110" Apr 20 
23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656907 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656910 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656913 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656918 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656921 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656925 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656928 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656936 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656939 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656955 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656958 2577 flags.go:64] FLAG: --pod-cidr="" Apr 20 23:12:50.662070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656961 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656967 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 
23:12:50.656970 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656973 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656976 2577 flags.go:64] FLAG: --port="10250" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656979 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656982 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06ed431e2ed57ade5" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656985 2577 flags.go:64] FLAG: --qos-reserved="" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656988 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656991 2577 flags.go:64] FLAG: --register-node="true" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656994 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.656997 2577 flags.go:64] FLAG: --register-with-taints="" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657001 2577 flags.go:64] FLAG: --registry-burst="10" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657004 2577 flags.go:64] FLAG: --registry-qps="5" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657007 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657010 2577 flags.go:64] FLAG: --reserved-memory="" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657014 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657017 2577 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657020 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657023 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657025 2577 flags.go:64] FLAG: --runonce="false" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657028 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657031 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657035 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657040 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657043 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 23:12:50.662727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657046 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657049 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657052 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657055 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657061 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657064 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 
23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657067 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657070 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657074 2577 flags.go:64] FLAG: --system-cgroups="" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657077 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657082 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657085 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657088 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657093 2577 flags.go:64] FLAG: --tls-min-version="" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657096 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657099 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657102 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657105 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657108 2577 flags.go:64] FLAG: --v="2" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657112 2577 flags.go:64] FLAG: --version="false" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657117 2577 flags.go:64] FLAG: --vmodule="" 
Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657122 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657125 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657244 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 23:12:50.663389 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657249 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657252 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657255 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657258 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657261 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657264 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657272 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657275 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657278 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 
23:12:50.657281 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657284 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657287 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657292 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657294 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657297 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657300 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657303 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657306 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657308 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657311 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 23:12:50.664004 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657314 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657316 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 
23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657319 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657322 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657325 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657327 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657330 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657333 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657335 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657338 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657340 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657343 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657346 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657349 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657351 2577 feature_gate.go:328] unrecognized feature gate: 
EtcdBackendQuota Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657354 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657357 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657360 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657364 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657367 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 23:12:50.664506 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657369 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657372 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657374 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657377 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657380 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657383 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657386 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657388 2577 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657391 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657393 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657396 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657398 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657401 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657404 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657406 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657409 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657411 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657414 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657416 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 23:12:50.665005 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657419 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 23:12:50.665005 
ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657421 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657424 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657426 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657429 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657432 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657434 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657437 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657439 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657441 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657444 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657448 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657451 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657455 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657458 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657460 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657463 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657466 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657469 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657472 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 23:12:50.665500 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657475 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657478 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657480 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657484 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657487 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.657490 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.657495 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.664243 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.664361 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664410 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664416 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664419 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664423 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664426 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664429 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 23:12:50.666028 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664432 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664435 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664438 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664441 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664443 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664446 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664449 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664451 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664454 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664457 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664459 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664462 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664465 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664468 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664470 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664473 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664475 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664478 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664481 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664484 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 23:12:50.666409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664486 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664488 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664491 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664493 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664496 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664500 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664503 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664506 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664508 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664511 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664514 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664516 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664520 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664525 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664528 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664530 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664533 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664536 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664538 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 23:12:50.666894 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664541 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664543 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664546 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664549 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664551 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664554 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664556 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664559 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664562 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664564 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664567 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664570 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664573 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664575 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664578 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664581 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664583 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664586 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664588 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664592 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 23:12:50.667382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664594 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664597 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664599 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664602 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664604 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664607 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664609 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664612 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664614 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664617 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664619 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664622 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664624 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664627 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664629 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664632 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664635 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664637 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664640 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 23:12:50.667890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664642 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664646 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.664653 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664753 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664758 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664762 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664765 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664769 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664773 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664776 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664780 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664783 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664786 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664790 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 23:12:50.668452 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664793 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664795 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664798 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664801 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664804 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664806 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664809 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664811 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664814 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664817 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664819 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664822 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664824 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664827 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664830 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664832 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664835 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664837 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664840 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664842 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 23:12:50.668850 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664845 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664847 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664850 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664853 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664856 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664858 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664861 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664863 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664866 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664868 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664871 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664874 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664877 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664880 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664882 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664885 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664888 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664890 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664893 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664896 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 23:12:50.669367 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664898 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664901 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664903 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664905 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664908 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664911 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664913 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664916 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664918 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664921 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664924 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664926 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664928 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664931 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664934 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664938 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664956 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664960 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664963 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664966 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 23:12:50.669860 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664969 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664971 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664974 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664977 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664979 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664982 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664985 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664988 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664990 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664992 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664995 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.664998 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.665000 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.665003 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:50.665006 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.665011 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 23:12:50.670361 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.665140 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 23:12:50.670764 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.667156 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 23:12:50.670764 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.668217 2577 server.go:1019] "Starting client certificate rotation"
Apr 20 23:12:50.670764 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.668314 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 23:12:50.670764 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.669176 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 23:12:50.694465 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.694439 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 23:12:50.698743 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.698724 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 23:12:50.718302 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.718278 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 20 23:12:50.724612 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.724591 2577 log.go:25] "Validated CRI v1 image API"
Apr 20 23:12:50.725835 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.725815 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 23:12:50.729116 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.729094 2577 fs.go:135] Filesystem UUIDs: map[17ad9816-8fcc-47e6-9aa1-4c3ac869f4e9:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a5d337b1-380e-4cb8-a98d-5de93f9bb76f:/dev/nvme0n1p4]
Apr 20 23:12:50.729193 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.729114 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 23:12:50.731770 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.731754 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 23:12:50.734592 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.734442 2577 manager.go:217] Machine: {Timestamp:2026-04-20 23:12:50.733183093 +0000 UTC m=+0.404552232 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3093764 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec222f8bb4e4d8c725e7484ab5885aef SystemUUID:ec222f8b-b4e4-d8c7-25e7-484ab5885aef BootID:24799950-f6aa-4af6-9e31-996180cd41fe Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:39:8b:47:61:ab Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:39:8b:47:61:ab Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:76:82:c0:09:cf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 23:12:50.734592 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.734589 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 23:12:50.734708 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.734685 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 23:12:50.736602 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.736573 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 23:12:50.736754 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.736604 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-166.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 23:12:50.736801 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.736767 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 23:12:50.736801 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.736777 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 23:12:50.736801 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.736790 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 23:12:50.738205 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.738192 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 23:12:50.739693 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.739681 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 23:12:50.739824 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.739814 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 23:12:50.742302 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.742291 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 23:12:50.742354 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.742309 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 23:12:50.742354 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.742324 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 23:12:50.742354 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.742338 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 20 23:12:50.742354 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.742347 2577 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 23:12:50.743426 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.743412 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 23:12:50.743477 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.743437 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 23:12:50.747024 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.747006 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 23:12:50.749124 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.749109 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 23:12:50.750584 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750570 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 23:12:50.750659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750593 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 23:12:50.750659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750603 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 23:12:50.750659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750611 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 23:12:50.750659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750620 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 23:12:50.750659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750635 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 23:12:50.750659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750644 2577 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 20 23:12:50.750659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750652 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 23:12:50.750862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750663 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 23:12:50.750862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750672 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 23:12:50.750862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750699 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 23:12:50.750862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.750712 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 23:12:50.751212 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.751193 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bsmqp" Apr 20 23:12:50.752199 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.752177 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 23:12:50.752250 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.752218 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 23:12:50.752366 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.752355 2577 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Apr 20 23:12:50.752399 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.752369 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 23:12:50.756183 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.756171 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 23:12:50.756226 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.756208 2577 server.go:1295] "Started kubelet" Apr 20 23:12:50.756326 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.756286 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 23:12:50.756393 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.756315 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 23:12:50.756393 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.756390 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 23:12:50.757121 ip-10-0-134-166 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 23:12:50.758311 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.758293 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 20 23:12:50.758400 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.758364 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 23:12:50.763479 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.763439 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 23:12:50.764353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.764333 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 23:12:50.764446 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.764419 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 23:12:50.765037 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.765019 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 23:12:50.765037 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.765038 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 23:12:50.765135 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.765048 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 23:12:50.765188 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.765177 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 20 23:12:50.765223 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.765194 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 20 23:12:50.765862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.765848 2577 factory.go:55] Registering systemd factory Apr 20 23:12:50.765933 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.765870 2577 
factory.go:223] Registration of the systemd container factory successfully Apr 20 23:12:50.766264 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.766246 2577 factory.go:153] Registering CRI-O factory Apr 20 23:12:50.766543 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.766523 2577 factory.go:223] Registration of the crio container factory successfully Apr 20 23:12:50.766601 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.766544 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-166.ec2.internal\" not found" Apr 20 23:12:50.766648 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.766615 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 23:12:50.766648 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.766646 2577 factory.go:103] Registering Raw factory Apr 20 23:12:50.766731 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.766641 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 23:12:50.766731 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.766663 2577 manager.go:1196] Started watching for new ooms in manager Apr 20 23:12:50.767085 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.767071 2577 manager.go:319] Starting recovery of all containers Apr 20 23:12:50.775015 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.774986 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 23:12:50.775220 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.775188 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 23:12:50.776313 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.774961 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a1d96581 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.756183425 +0000 UTC m=+0.427552568,LastTimestamp:2026-04-20 23:12:50.756183425 
+0000 UTC m=+0.427552568,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:50.777833 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.777814 2577 manager.go:324] Recovery completed Apr 20 23:12:50.782551 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.782539 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:50.785652 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.785632 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:50.785738 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.785664 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:50.785738 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.785675 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:50.786192 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.786176 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 23:12:50.786192 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.786189 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 23:12:50.786309 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.786210 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 20 23:12:50.787542 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.787473 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b0628 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785650216 +0000 UTC m=+0.457019354,LastTimestamp:2026-04-20 23:12:50.785650216 +0000 UTC m=+0.457019354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:50.788631 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.788617 2577 policy_none.go:49] "None policy: Start" Apr 20 23:12:50.788702 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.788633 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 23:12:50.788702 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.788645 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 20 23:12:50.795677 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.795591 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b51e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785669607 +0000 UTC m=+0.457038744,LastTimestamp:2026-04-20 23:12:50.785669607 +0000 UTC 
m=+0.457038744,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:50.808249 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.808165 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b78b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.78567954 +0000 UTC m=+0.457048678,LastTimestamp:2026-04-20 23:12:50.78567954 +0000 UTC m=+0.457048678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:50.833637 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.833616 2577 manager.go:341] "Starting Device Plugin manager" Apr 20 23:12:50.837956 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.833721 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 23:12:50.837956 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.833736 2577 server.go:85] "Starting device plugin registration server" Apr 20 23:12:50.837956 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.834112 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 23:12:50.837956 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.834128 2577 
container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 23:12:50.837956 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.834279 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 23:12:50.837956 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.834383 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 23:12:50.837956 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.834390 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 23:12:50.837956 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.835562 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 23:12:50.837956 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.835610 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-166.ec2.internal\" not found" Apr 20 23:12:50.846839 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.846752 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a69e0aca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.836179658 +0000 UTC m=+0.507548797,LastTimestamp:2026-04-20 23:12:50.836179658 +0000 UTC m=+0.507548797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:50.912218 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.912135 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 23:12:50.913511 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.913487 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 23:12:50.913511 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.913516 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 23:12:50.913667 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.913545 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 23:12:50.913667 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.913553 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 23:12:50.913667 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.913643 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 23:12:50.925754 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.925714 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 20 23:12:50.935997 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.935980 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:50.937259 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.937238 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:50.937373 ip-10-0-134-166 
kubenswrapper[2577]: I0420 23:12:50.937272 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:50.937373 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.937286 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:50.937373 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:50.937317 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:50.947633 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.947535 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b0628\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b0628 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785650216 +0000 UTC m=+0.457019354,LastTimestamp:2026-04-20 23:12:50.937257469 +0000 UTC m=+0.608626606,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:50.954980 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.954936 2577 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group 
\"\" at the cluster scope" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:50.955107 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.954926 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b51e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b51e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785669607 +0000 UTC m=+0.457038744,LastTimestamp:2026-04-20 23:12:50.93727729 +0000 UTC m=+0.608646427,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:50.964618 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.964538 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b78b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b78b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 
23:12:50.78567954 +0000 UTC m=+0.457048678,LastTimestamp:2026-04-20 23:12:50.937292934 +0000 UTC m=+0.608662074,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:50.986358 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:50.986330 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="400ms" Apr 20 23:12:51.014568 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.014517 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal"] Apr 20 23:12:51.014659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.014645 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:51.016262 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.016243 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:51.016358 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.016277 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:51.016358 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.016292 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:51.017697 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.017680 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:51.017842 
ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.017827 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.017881 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.017859 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:51.018843 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.018826 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:51.018920 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.018853 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:51.018920 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.018863 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:51.018920 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.018826 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:51.019091 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.018961 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:51.019091 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.018980 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:51.020156 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.020140 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.020244 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.020169 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:51.021182 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.021164 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:51.021277 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.021194 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:51.021277 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.021206 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:51.025478 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.025405 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b0628\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b0628 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785650216 +0000 UTC m=+0.457019354,LastTimestamp:2026-04-20 23:12:51.016261174 +0000 UTC m=+0.687630312,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.033697 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.033625 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b51e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b51e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785669607 +0000 UTC m=+0.457038744,LastTimestamp:2026-04-20 23:12:51.016284121 +0000 UTC m=+0.687653260,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.042594 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.042518 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b78b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b78b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-166.ec2.internal status is now: 
NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.78567954 +0000 UTC m=+0.457048678,LastTimestamp:2026-04-20 23:12:51.016297003 +0000 UTC m=+0.687666141,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.051144 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.051066 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b0628\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b0628 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785650216 +0000 UTC m=+0.457019354,LastTimestamp:2026-04-20 23:12:51.018844676 +0000 UTC m=+0.690213815,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.051974 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.051940 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.056525 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.056503 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.059717 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.059647 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b51e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b51e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785669607 +0000 UTC m=+0.457038744,LastTimestamp:2026-04-20 23:12:51.018858212 +0000 UTC m=+0.690227350,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.067210 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.067189 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ff387f4dda566eb1da7627d669fc453f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal\" (UID: \"ff387f4dda566eb1da7627d669fc453f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.067314 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.067218 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff387f4dda566eb1da7627d669fc453f-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal\" (UID: \"ff387f4dda566eb1da7627d669fc453f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.067314 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.067234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6dec7aee4b5cc7a9a5eebe9083c21fbd-config\") pod \"kube-apiserver-proxy-ip-10-0-134-166.ec2.internal\" (UID: \"6dec7aee4b5cc7a9a5eebe9083c21fbd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.070569 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.070498 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b78b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b78b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.78567954 +0000 UTC m=+0.457048678,LastTimestamp:2026-04-20 23:12:51.018867084 +0000 UTC m=+0.690236221,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.078434 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.078356 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b0628\" is forbidden: User \"system:anonymous\" cannot 
patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b0628 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785650216 +0000 UTC m=+0.457019354,LastTimestamp:2026-04-20 23:12:51.018924793 +0000 UTC m=+0.690293931,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.086385 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.086305 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b51e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b51e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785669607 +0000 UTC m=+0.457038744,LastTimestamp:2026-04-20 23:12:51.018971283 +0000 UTC m=+0.690340425,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.098632 
ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.098551 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b78b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b78b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.78567954 +0000 UTC m=+0.457048678,LastTimestamp:2026-04-20 23:12:51.018986365 +0000 UTC m=+0.690355504,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.106585 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.106512 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b0628\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b0628 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785650216 +0000 UTC m=+0.457019354,LastTimestamp:2026-04-20 
23:12:51.02117837 +0000 UTC m=+0.692547516,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.119135 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.119058 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b51e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b51e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785669607 +0000 UTC m=+0.457038744,LastTimestamp:2026-04-20 23:12:51.021199977 +0000 UTC m=+0.692569115,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.128141 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.128063 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b78b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b78b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node 
ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.78567954 +0000 UTC m=+0.457048678,LastTimestamp:2026-04-20 23:12:51.021212156 +0000 UTC m=+0.692581300,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.155133 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.155089 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:51.156182 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.156164 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:51.156254 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.156198 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:51.156254 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.156208 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:51.156254 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.156239 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.165792 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.165672 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b0628\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b0628 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785650216 +0000 UTC m=+0.457019354,LastTimestamp:2026-04-20 23:12:51.15618307 +0000 UTC m=+0.827552208,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.168343 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.168322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ff387f4dda566eb1da7627d669fc453f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal\" (UID: \"ff387f4dda566eb1da7627d669fc453f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.168413 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.168353 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff387f4dda566eb1da7627d669fc453f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal\" (UID: \"ff387f4dda566eb1da7627d669fc453f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.168413 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.168369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6dec7aee4b5cc7a9a5eebe9083c21fbd-config\") pod \"kube-apiserver-proxy-ip-10-0-134-166.ec2.internal\" (UID: \"6dec7aee4b5cc7a9a5eebe9083c21fbd\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.168490 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.168436 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ff387f4dda566eb1da7627d669fc453f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal\" (UID: \"ff387f4dda566eb1da7627d669fc453f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.168524 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.168494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6dec7aee4b5cc7a9a5eebe9083c21fbd-config\") pod \"kube-apiserver-proxy-ip-10-0-134-166.ec2.internal\" (UID: \"6dec7aee4b5cc7a9a5eebe9083c21fbd\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.168524 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.168520 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff387f4dda566eb1da7627d669fc453f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal\" (UID: \"ff387f4dda566eb1da7627d669fc453f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.173156 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.173070 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b51e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b51e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785669607 +0000 UTC m=+0.457038744,LastTimestamp:2026-04-20 23:12:51.156203137 +0000 UTC m=+0.827572275,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.173268 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.173126 2577 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.183706 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.183622 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b78b4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b78b4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.78567954 +0000 UTC m=+0.457048678,LastTimestamp:2026-04-20 23:12:51.156212585 +0000 UTC 
m=+0.827581724,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.354587 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.354540 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.358248 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.358226 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.395985 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.395939 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="800ms" Apr 20 23:12:51.573956 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.573917 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:51.575577 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.575560 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:51.575702 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.575590 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:51.575702 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.575600 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:51.575702 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.575636 2577 kubelet_node_status.go:78] "Attempting to 
register node" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.584534 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.584452 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b0628\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b0628 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785650216 +0000 UTC m=+0.457019354,LastTimestamp:2026-04-20 23:12:51.575576121 +0000 UTC m=+1.246945260,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.592724 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.592687 2577 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:51.592829 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.592756 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"ip-10-0-134-166.ec2.internal.18a83395a39b51e7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-166.ec2.internal.18a83395a39b51e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-166.ec2.internal,UID:ip-10-0-134-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-134-166.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:50.785669607 +0000 UTC m=+0.457038744,LastTimestamp:2026-04-20 23:12:51.57559436 +0000 UTC m=+1.246963498,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:51.657240 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.657203 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 23:12:51.773484 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:51.773453 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 23:12:51.869863 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.869776 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 23:12:51.869863 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.869795 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 23:12:51.903090 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:51.903045 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 20 23:12:52.011063 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:52.011012 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff387f4dda566eb1da7627d669fc453f.slice/crio-d1c912c3c04ec7fa4c2a87fb15efba1d7610e942e449a9951577c8721fd3a378 WatchSource:0}: Error finding container d1c912c3c04ec7fa4c2a87fb15efba1d7610e942e449a9951577c8721fd3a378: Status 404 returned error can't find the container with id d1c912c3c04ec7fa4c2a87fb15efba1d7610e942e449a9951577c8721fd3a378 Apr 20 23:12:52.011409 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:12:52.011387 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6dec7aee4b5cc7a9a5eebe9083c21fbd.slice/crio-120ad7136d0214b1f727987b3f339ca07fb9dcbcb1b2ad77b7632ab2f9617153 WatchSource:0}: Error finding container 120ad7136d0214b1f727987b3f339ca07fb9dcbcb1b2ad77b7632ab2f9617153: Status 404 returned error can't find the container with id 120ad7136d0214b1f727987b3f339ca07fb9dcbcb1b2ad77b7632ab2f9617153 Apr 20 23:12:52.016061 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:52.016045 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 23:12:52.026212 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:52.026126 2577 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-134-166.ec2.internal.18a83395ecf4f182 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-134-166.ec2.internal,UID:6dec7aee4b5cc7a9a5eebe9083c21fbd,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:294af5c64228434d1ed6ee8ea3ac802e3c999aa847223e3b2efa18425a9fe421\",Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:52.016279938 +0000 UTC m=+1.687649063,LastTimestamp:2026-04-20 23:12:52.016279938 +0000 UTC m=+1.687649063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:52.034368 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:52.034267 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83395ecf5ed41 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\",Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:52.016344385 +0000 UTC m=+1.687713528,LastTimestamp:2026-04-20 23:12:52.016344385 +0000 UTC m=+1.687713528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:52.204237 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:52.204144 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="1.6s" Apr 20 23:12:52.393386 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:52.393344 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:52.394350 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:52.394330 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:52.394473 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:52.394366 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:52.394473 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:52.394377 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:52.394473 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:52.394407 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:52.411350 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:52.411318 2577 
kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:52.773201 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:52.773167 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 23:12:52.919208 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:52.919144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" event={"ID":"ff387f4dda566eb1da7627d669fc453f","Type":"ContainerStarted","Data":"d1c912c3c04ec7fa4c2a87fb15efba1d7610e942e449a9951577c8721fd3a378"} Apr 20 23:12:52.920230 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:52.920202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal" event={"ID":"6dec7aee4b5cc7a9a5eebe9083c21fbd","Type":"ContainerStarted","Data":"120ad7136d0214b1f727987b3f339ca07fb9dcbcb1b2ad77b7632ab2f9617153"} Apr 20 23:12:53.655662 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:53.655559 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-134-166.ec2.internal.18a833964e13b5d7 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-134-166.ec2.internal,UID:6dec7aee4b5cc7a9a5eebe9083c21fbd,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:294af5c64228434d1ed6ee8ea3ac802e3c999aa847223e3b2efa18425a9fe421\" in 1.629s (1.629s including waiting). Image size: 488332864 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:53.645686231 +0000 UTC m=+3.317055369,LastTimestamp:2026-04-20 23:12:53.645686231 +0000 UTC m=+3.317055369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:53.666023 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:53.665986 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Apr 20 23:12:53.666180 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:53.666045 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a833964e19b286 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" in 1.629s (1.629s including waiting). Image size: 468435751 bytes.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:53.646078598 +0000 UTC m=+3.317447723,LastTimestamp:2026-04-20 23:12:53.646078598 +0000 UTC m=+3.317447723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:53.673712 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:53.673676 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 23:12:53.735369 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:53.735263 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-134-166.ec2.internal.18a8339652cc4ed1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-134-166.ec2.internal,UID:6dec7aee4b5cc7a9a5eebe9083c21fbd,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Created,Message:Created container: 
haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:53.724892881 +0000 UTC m=+3.396262006,LastTimestamp:2026-04-20 23:12:53.724892881 +0000 UTC m=+3.396262006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:53.743523 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:53.743425 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{kube-apiserver-proxy-ip-10-0-134-166.ec2.internal.18a833965333d38e kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-proxy-ip-10-0-134-166.ec2.internal,UID:6dec7aee4b5cc7a9a5eebe9083c21fbd,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{haproxy},},Reason:Started,Message:Started container haproxy,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:53.73167707 +0000 UTC m=+3.403046210,LastTimestamp:2026-04-20 23:12:53.73167707 +0000 UTC m=+3.403046210,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:53.771284 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:53.771246 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 23:12:53.811984 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:53.811937 2577 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="3.2s" Apr 20 23:12:53.923393 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:53.923305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal" event={"ID":"6dec7aee4b5cc7a9a5eebe9083c21fbd","Type":"ContainerStarted","Data":"c1485730ab8d8589471e506e5c79e172f4801d06faf3e33b0bd6d1a37f20ac98"} Apr 20 23:12:53.923393 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:53.923376 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:53.925056 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:53.925019 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:53.925175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:53.925068 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:53.925175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:53.925082 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:53.925290 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:53.925270 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:54.011788 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.011753 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:54.012835 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.012817 2577 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:54.012929 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.012872 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:54.012929 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.012886 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:54.012929 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.012916 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:54.032335 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:54.032302 2577 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:54.253018 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:54.252918 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a8339671c56605 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:54.244533765 +0000 UTC 
m=+3.915902913,LastTimestamp:2026-04-20 23:12:54.244533765 +0000 UTC m=+3.915902913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:54.264261 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:54.264183 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a833967234cf61 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:54.251835233 +0000 UTC m=+3.923204371,LastTimestamp:2026-04-20 23:12:54.251835233 +0000 UTC m=+3.923204371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:54.461227 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:54.461144 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 23:12:54.657285 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:54.657254 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User 
\"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 23:12:54.772358 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.772296 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 23:12:54.926790 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.926739 2577 generic.go:358] "Generic (PLEG): container finished" podID="ff387f4dda566eb1da7627d669fc453f" containerID="f95b92c3e75ebb87285641b728b53b22fab0ebc26722da89d732d7067839d4e0" exitCode=0 Apr 20 23:12:54.927152 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.926818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" event={"ID":"ff387f4dda566eb1da7627d669fc453f","Type":"ContainerDied","Data":"f95b92c3e75ebb87285641b728b53b22fab0ebc26722da89d732d7067839d4e0"} Apr 20 23:12:54.927152 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.926840 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:54.927152 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.926851 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:54.927674 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.927655 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:54.927758 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.927689 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 
23:12:54.927758 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.927690 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:54.927758 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.927704 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:54.927758 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.927711 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:54.927758 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:54.927728 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:54.927999 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:54.927981 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:54.929902 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:54.928362 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:54.940822 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:54.940744 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a833969aba3084 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:54.931665028 +0000 UTC m=+4.603034175,LastTimestamp:2026-04-20 23:12:54.931665028 +0000 UTC m=+4.603034175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:55.047247 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:55.047153 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83396a12102e9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:55.039066857 +0000 UTC m=+4.710436025,LastTimestamp:2026-04-20 23:12:55.039066857 +0000 UTC m=+4.710436025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:55.055855 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:55.055768 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83396a19d0ddc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:55.047196124 +0000 UTC m=+4.718565265,LastTimestamp:2026-04-20 23:12:55.047196124 +0000 UTC m=+4.718565265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:55.774379 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:55.774348 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 23:12:55.929124 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:55.929080 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/0.log" Apr 20 23:12:55.929502 ip-10-0-134-166 kubenswrapper[2577]: 
I0420 23:12:55.929433 2577 generic.go:358] "Generic (PLEG): container finished" podID="ff387f4dda566eb1da7627d669fc453f" containerID="571ad39532b2217cc60f47f9e33f3b7d9246e05235c9e7254a7d7f2a1dfcecd7" exitCode=1 Apr 20 23:12:55.929502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:55.929465 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" event={"ID":"ff387f4dda566eb1da7627d669fc453f","Type":"ContainerDied","Data":"571ad39532b2217cc60f47f9e33f3b7d9246e05235c9e7254a7d7f2a1dfcecd7"} Apr 20 23:12:55.929607 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:55.929538 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:55.930360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:55.930344 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:55.930421 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:55.930375 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:55.930421 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:55.930385 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:55.930561 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:55.930548 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:55.930618 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:55.930595 2577 scope.go:117] "RemoveContainer" containerID="571ad39532b2217cc60f47f9e33f3b7d9246e05235c9e7254a7d7f2a1dfcecd7" Apr 20 23:12:55.940978 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:55.940884 2577 event.go:359] "Server 
rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a833969aba3084\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a833969aba3084 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8e1f4bf2daa69c9e69766cebcfced43fa0ee926d4479becdc8a5b05b93ca6e81\" already present on machine,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:54.931665028 +0000 UTC m=+4.603034175,LastTimestamp:2026-04-20 23:12:55.932458094 +0000 UTC m=+5.603827240,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:56.041099 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:56.041009 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83396a12102e9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83396a12102e9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:55.039066857 +0000 UTC m=+4.710436025,LastTimestamp:2026-04-20 23:12:56.031980211 +0000 UTC m=+5.703349363,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:56.050512 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:56.050425 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83396a19d0ddc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83396a19d0ddc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:55.047196124 +0000 UTC m=+4.718565265,LastTimestamp:2026-04-20 23:12:56.039479627 +0000 UTC m=+5.710848777,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:56.772176 ip-10-0-134-166 
kubenswrapper[2577]: I0420 23:12:56.772148 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 23:12:56.932423 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.932399 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/1.log" Apr 20 23:12:56.932746 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.932733 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/0.log" Apr 20 23:12:56.933073 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.933045 2577 generic.go:358] "Generic (PLEG): container finished" podID="ff387f4dda566eb1da7627d669fc453f" containerID="c0927acedb949f30a85d2209166d9f5f74f8f140ac2a7a6dadbced296d4a299c" exitCode=1 Apr 20 23:12:56.933189 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.933081 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" event={"ID":"ff387f4dda566eb1da7627d669fc453f","Type":"ContainerDied","Data":"c0927acedb949f30a85d2209166d9f5f74f8f140ac2a7a6dadbced296d4a299c"} Apr 20 23:12:56.933189 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.933109 2577 scope.go:117] "RemoveContainer" containerID="571ad39532b2217cc60f47f9e33f3b7d9246e05235c9e7254a7d7f2a1dfcecd7" Apr 20 23:12:56.933189 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.933137 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:56.933924 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.933897 
2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:56.934030 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.933933 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:56.934030 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.933964 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 23:12:56.934429 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:56.934246 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal" Apr 20 23:12:56.934429 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:56.934306 2577 scope.go:117] "RemoveContainer" containerID="c0927acedb949f30a85d2209166d9f5f74f8f140ac2a7a6dadbced296d4a299c" Apr 20 23:12:56.934525 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:56.934493 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_openshift-machine-config-operator(ff387f4dda566eb1da7627d669fc453f)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" podUID="ff387f4dda566eb1da7627d669fc453f" Apr 20 23:12:56.941686 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:56.941615 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83397121a2277 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_openshift-machine-config-operator(ff387f4dda566eb1da7627d669fc453f),Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:56.934441591 +0000 UTC m=+6.605810739,LastTimestamp:2026-04-20 23:12:56.934441591 +0000 UTC m=+6.605810739,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}" Apr 20 23:12:57.020860 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:57.020823 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="6.4s" Apr 20 23:12:57.232865 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.232825 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 23:12:57.233862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.233843 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 23:12:57.233998 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.233879 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 23:12:57.233998 ip-10-0-134-166 kubenswrapper[2577]: I0420 
23:12:57.233892 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:12:57.233998 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.233927 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:12:57.250223 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:57.250194 2577 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:12:57.773381 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.773346 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:12:57.935413 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.935384 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/1.log"
Apr 20 23:12:57.935830 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.935815 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:12:57.936603 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.936583 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:12:57.936679 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.936621 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:12:57.936679 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.936636 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:12:57.938766 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:57.938748 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:12:57.938842 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:57.938814 2577 scope.go:117] "RemoveContainer" containerID="c0927acedb949f30a85d2209166d9f5f74f8f140ac2a7a6dadbced296d4a299c"
Apr 20 23:12:57.939031 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:57.939011 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_openshift-machine-config-operator(ff387f4dda566eb1da7627d669fc453f)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" podUID="ff387f4dda566eb1da7627d669fc453f"
Apr 20 23:12:57.947934 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:57.947857 2577 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83397121a2277\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal.18a83397121a2277 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal,UID:ff387f4dda566eb1da7627d669fc453f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_openshift-machine-config-operator(ff387f4dda566eb1da7627d669fc453f),Source:EventSource{Component:kubelet,Host:ip-10-0-134-166.ec2.internal,},FirstTimestamp:2026-04-20 23:12:56.934441591 +0000 UTC m=+6.605810739,LastTimestamp:2026-04-20 23:12:57.938979456 +0000 UTC m=+7.610348597,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-166.ec2.internal,}"
Apr 20 23:12:58.499714 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:58.499675 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 23:12:58.772821 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:58.772737 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:12:59.080509 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:59.080477 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 20 23:12:59.080509 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:12:59.080477 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 23:12:59.773546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:12:59.773517 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:00.304221 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:00.304180 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 23:13:00.771546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:00.771516 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:00.835793 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:00.835757 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-166.ec2.internal\" not found"
Apr 20 23:13:01.773099 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:01.773063 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:02.772153 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:02.772120 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:03.429796 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:03.429756 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Apr 20 23:13:03.650934 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:03.650892 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:13:03.651929 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:03.651911 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:13:03.652052 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:03.651956 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:13:03.652052 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:03.651968 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:13:03.652052 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:03.651994 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:13:03.669263 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:03.669237 2577 kubelet_node_status.go:116] "Unable to register node with API server, error getting existing node" err="nodes \"ip-10-0-134-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"nodes\" in API group \"\" at the cluster scope" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:13:03.772237 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:03.772167 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:04.774084 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:04.774047 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:05.772823 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:05.772791 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:05.956086 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:05.956049 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 23:13:06.772982 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:06.772933 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:07.776088 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:07.776056 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:07.888257 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:07.888224 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Apr 20 23:13:08.773586 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:08.773556 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 23:13:08.819830 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:08.819803 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bsmqp"
Apr 20 23:13:08.914439 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:08.914397 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:13:08.915456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:08.915440 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:13:08.915540 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:08.915472 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:13:08.915540 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:08.915486 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:13:08.915763 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:08.915748 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:13:08.915824 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:08.915813 2577 scope.go:117] "RemoveContainer" containerID="c0927acedb949f30a85d2209166d9f5f74f8f140ac2a7a6dadbced296d4a299c"
Apr 20 23:13:09.668234 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.668199 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 23:13:09.780061 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.780030 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:09.795463 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.795445 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:09.807693 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.807672 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 23:13:09.821460 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.821414 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 23:08:08 +0000 UTC" deadline="2027-10-19 09:55:35.500505927 +0000 UTC"
Apr 20 23:13:09.821460 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.821457 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13114h42m25.679051496s"
Apr 20 23:13:09.854889 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.854870 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:09.953828 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.953768 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/2.log"
Apr 20 23:13:09.954190 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.954175 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/1.log"
Apr 20 23:13:09.954512 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.954491 2577 generic.go:358] "Generic (PLEG): container finished" podID="ff387f4dda566eb1da7627d669fc453f" containerID="e09f74f7868f52dc9fafb978926c7d4883e2009c4e8f29d2520b661d8579a4e5" exitCode=1
Apr 20 23:13:09.954567 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.954526 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" event={"ID":"ff387f4dda566eb1da7627d669fc453f","Type":"ContainerDied","Data":"e09f74f7868f52dc9fafb978926c7d4883e2009c4e8f29d2520b661d8579a4e5"}
Apr 20 23:13:09.954567 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.954554 2577 scope.go:117] "RemoveContainer" containerID="c0927acedb949f30a85d2209166d9f5f74f8f140ac2a7a6dadbced296d4a299c"
Apr 20 23:13:09.954686 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.954672 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:13:09.955710 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.955472 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:13:09.955710 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.955501 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:13:09.955710 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.955511 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:13:09.955864 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:09.955744 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:13:09.955864 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:09.955789 2577 scope.go:117] "RemoveContainer" containerID="e09f74f7868f52dc9fafb978926c7d4883e2009c4e8f29d2520b661d8579a4e5"
Apr 20 23:13:09.955976 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:09.955910 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_openshift-machine-config-operator(ff387f4dda566eb1da7627d669fc453f)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" podUID="ff387f4dda566eb1da7627d669fc453f"
Apr 20 23:13:10.131802 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.131773 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:10.131802 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.131802 2577 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:10.169293 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.169269 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:10.185275 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.185245 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:10.248782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.248731 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:10.435158 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.435133 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-166.ec2.internal\" not found" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:13:10.505983 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.505927 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:10.505983 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.505961 2577 csi_plugin.go:397] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ip-10-0-134-166.ec2.internal" not found
Apr 20 23:13:10.669880 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.669837 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 23:13:10.670817 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.670798 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 23:13:10.670915 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.670831 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 23:13:10.670915 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.670844 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 23:13:10.670915 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.670876 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:13:10.677987 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.677973 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-166.ec2.internal"
Apr 20 23:13:10.678034 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.677996 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-166.ec2.internal\": node \"ip-10-0-134-166.ec2.internal\" not found"
Apr 20 23:13:10.755671 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.755629 2577 apiserver.go:52] "Watching apiserver"
Apr 20 23:13:10.761031 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.760984 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 23:13:10.761159 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.761142 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-2lxxb","openshift-network-diagnostics/network-check-target-cvv7k"]
Apr 20 23:13:10.766074 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.766056 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal"
Apr 20 23:13:10.767339 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.767322 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k"
Apr 20 23:13:10.767406 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.767388 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916"
Apr 20 23:13:10.767452 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.767322 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.769974 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.769931 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 23:13:10.770109 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.769991 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 23:13:10.770109 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.770023 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 23:13:10.770109 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.770051 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qhtjr\""
Apr 20 23:13:10.770265 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.770249 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 23:13:10.772489 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.772469 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 23:13:10.780356 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-cnibin\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780473 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-os-release\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780473 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780381 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-var-lib-kubelet\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780473 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-system-cni-dir\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780473 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-socket-dir-parent\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780498 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-var-lib-cni-multus\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780520 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-conf-dir\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-daemon-config\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-etc-kubernetes\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780596 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrhdd\" (UniqueName: \"kubernetes.io/projected/b96ea9fa-073d-43b9-86ea-ea051d78bec9-kube-api-access-nrhdd\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b96ea9fa-073d-43b9-86ea-ea051d78bec9-cni-binary-copy\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780820 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-hostroot\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780820 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780662 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-run-multus-certs\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780820 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780689 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k"
Apr 20 23:13:10.780820 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780710 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-run-k8s-cni-cncf-io\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780820 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780725 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-run-netns\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780820 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-var-lib-cni-bin\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.780820 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.780783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-cni-dir\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb"
Apr 20 23:13:10.787738 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.787719 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 23:13:10.788828 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.788813 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal"]
Apr 20 23:13:10.789001 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.788987 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 23:13:10.789049 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.789039 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal"
Apr 20 23:13:10.801307 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.801282 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-zh8tf"]
Apr 20 23:13:10.804143 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.804123 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zh8tf"
Apr 20 23:13:10.806729 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.806710 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 23:13:10.806816 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.806784 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 23:13:10.807062 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.807049 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 23:13:10.807138 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.807124 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rsfgq\""
Apr 20 23:13:10.811169 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.811152 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kbfps"]
Apr 20 23:13:10.814047 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.814032 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps"
Apr 20 23:13:10.816354 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.816333 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 23:13:10.816455 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.816399 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 23:13:10.817030 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.817013 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 23:13:10.817163 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.817148 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 23:13:10.817685 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.817670 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 23:13:10.817763 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.817708 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 23:13:10.817763 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.817734 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vbkp7\""
Apr 20 23:13:10.826213 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.826194 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-p7prf"
Apr 20 23:13:10.829482 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.829469 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-2mnql"]
Apr 20 
23:13:10.832267 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.832252 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:10.836509 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.836462 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-166.ec2.internal" podStartSLOduration=0.836446678 podStartE2EDuration="836.446678ms" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:13:10.834895195 +0000 UTC m=+20.506264343" watchObservedRunningTime="2026-04-20 23:13:10.836446678 +0000 UTC m=+20.507815827" Apr 20 23:13:10.836813 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.836776 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 23:13:10.836917 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.836836 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jht7f\"" Apr 20 23:13:10.837005 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.836986 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 23:13:10.840813 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.840795 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-p7prf" Apr 20 23:13:10.866095 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.866073 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 23:13:10.880925 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.880900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-cni-dir\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881025 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.880931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-var-lib-kubelet\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-cni-dir\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881070 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-var-lib-kubelet\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881137 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fcba1bc2-a90d-4680-b9a4-71239880dae3-host-slash\") pod \"iptables-alerter-zh8tf\" (UID: \"fcba1bc2-a90d-4680-b9a4-71239880dae3\") " pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:10.881137 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-run-openvswitch\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881137 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-node-log\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881285 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881142 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-kubelet\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881285 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881166 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-system-cni-dir\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881285 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-var-lib-cni-multus\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881285 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881219 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-system-cni-dir\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881285 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-daemon-config\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881285 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881262 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48hn9\" (UniqueName: \"kubernetes.io/projected/fcba1bc2-a90d-4680-b9a4-71239880dae3-kube-api-access-48hn9\") pod \"iptables-alerter-zh8tf\" (UID: \"fcba1bc2-a90d-4680-b9a4-71239880dae3\") " pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:10.881285 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-run-ovn\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-var-lib-cni-multus\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881365 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-cni-netd\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/33327497-14af-4ec5-a658-d9a33e5963c1-agent-certs\") pod \"konnectivity-agent-2mnql\" (UID: \"33327497-14af-4ec5-a658-d9a33e5963c1\") " pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:10.881598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881455 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b96ea9fa-073d-43b9-86ea-ea051d78bec9-cni-binary-copy\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881482 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-run-multus-certs\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-run-ovn-kubernetes\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881598 ip-10-0-134-166 
kubenswrapper[2577]: I0420 23:13:10.881542 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92tdb\" (UniqueName: \"kubernetes.io/projected/da93c8a9-809d-4b57-b6ad-138eab016391-kube-api-access-92tdb\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881567 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-log-socket\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881592 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da93c8a9-809d-4b57-b6ad-138eab016391-ovnkube-config\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-run-multus-certs\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da93c8a9-809d-4b57-b6ad-138eab016391-env-overrides\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881647 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-cnibin\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-os-release\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-slash\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-cnibin\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881750 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-os-release\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 
23:13:10.881756 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-daemon-config\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-run-netns\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-run-systemd\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-cni-bin\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da93c8a9-809d-4b57-b6ad-138eab016391-ovn-node-metrics-cert\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881861 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-systemd-units\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b96ea9fa-073d-43b9-86ea-ea051d78bec9-cni-binary-copy\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-socket-dir-parent\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-conf-dir\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.881926 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-conf-dir\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " 
pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881974 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-etc-kubernetes\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.881998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrhdd\" (UniqueName: \"kubernetes.io/projected/b96ea9fa-073d-43b9-86ea-ea051d78bec9-kube-api-access-nrhdd\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-multus-socket-dir-parent\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882016 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-var-lib-openvswitch\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-hostroot\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " 
pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882078 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-etc-kubernetes\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-hostroot\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-etc-openvswitch\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882115 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da93c8a9-809d-4b57-b6ad-138eab016391-ovnkube-script-lib\") pod \"ovnkube-node-kbfps\" (UID: 
\"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882136 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-run-k8s-cni-cncf-io\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882152 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-run-netns\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-var-lib-cni-bin\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-run-k8s-cni-cncf-io\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882218 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-run-netns\") pod \"multus-2lxxb\" (UID: 
\"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fcba1bc2-a90d-4680-b9a4-71239880dae3-iptables-alerter-script\") pod \"iptables-alerter-zh8tf\" (UID: \"fcba1bc2-a90d-4680-b9a4-71239880dae3\") " pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:10.882418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.882876 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b96ea9fa-073d-43b9-86ea-ea051d78bec9-host-var-lib-cni-bin\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.882876 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.882280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/33327497-14af-4ec5-a658-d9a33e5963c1-konnectivity-ca\") pod \"konnectivity-agent-2mnql\" (UID: \"33327497-14af-4ec5-a658-d9a33e5963c1\") " pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:10.889241 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.889219 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:13:10.889241 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.889240 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:13:10.889397 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.889251 2577 projected.go:194] Error preparing data for projected volume kube-api-access-48wx7 for pod openshift-network-diagnostics/network-check-target-cvv7k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:10.889397 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.889306 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7 podName:dfd7de81-99ac-43ab-b2b7-773e89f49916 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:11.38929083 +0000 UTC m=+21.060659954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-48wx7" (UniqueName: "kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7") pod "network-check-target-cvv7k" (UID: "dfd7de81-99ac-43ab-b2b7-773e89f49916") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:10.889724 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.889706 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 23:13:10.893635 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.893511 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrhdd\" (UniqueName: \"kubernetes.io/projected/b96ea9fa-073d-43b9-86ea-ea051d78bec9-kube-api-access-nrhdd\") pod \"multus-2lxxb\" (UID: \"b96ea9fa-073d-43b9-86ea-ea051d78bec9\") " pod="openshift-multus/multus-2lxxb" Apr 20 23:13:10.896440 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.896422 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 23:13:10.897915 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.897897 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal"] Apr 20 23:13:10.906821 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.906803 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zhjs2"] Apr 20 23:13:10.909867 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.909849 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:10.910244 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.910228 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz"] Apr 20 23:13:10.912181 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.912156 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4f2gz\"" Apr 20 23:13:10.912448 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.912433 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 23:13:10.912509 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.912451 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 23:13:10.912895 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.912880 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:10.915109 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.915088 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 23:13:10.915156 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.915124 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 23:13:10.915413 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.915399 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-gxzb9\"" Apr 20 23:13:10.915684 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.915671 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 23:13:10.939152 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.939125 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7vff9"] Apr 20 23:13:10.941889 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.941874 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:10.941972 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.941940 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:10.942011 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.941981 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-g2ng5"] Apr 20 23:13:10.944717 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.944705 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:10.947934 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.947920 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 23:13:10.948367 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.948353 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 23:13:10.948420 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.948356 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vx8r6\"" Apr 20 23:13:10.956879 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.956864 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/2.log" Apr 20 23:13:10.957452 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.957439 2577 scope.go:117] "RemoveContainer" containerID="e09f74f7868f52dc9fafb978926c7d4883e2009c4e8f29d2520b661d8579a4e5" Apr 20 23:13:10.957591 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:10.957575 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio 
pod=kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_openshift-machine-config-operator(ff387f4dda566eb1da7627d669fc453f)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" podUID="ff387f4dda566eb1da7627d669fc453f" Apr 20 23:13:10.969373 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.969351 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5f4p9"] Apr 20 23:13:10.972082 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.972068 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:10.974508 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.974489 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 23:13:10.974607 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.974551 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 23:13:10.974752 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.974738 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 23:13:10.976219 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.976206 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jhf8q\"" Apr 20 23:13:10.982713 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/33327497-14af-4ec5-a658-d9a33e5963c1-konnectivity-ca\") pod \"konnectivity-agent-2mnql\" (UID: \"33327497-14af-4ec5-a658-d9a33e5963c1\") " pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:10.982792 ip-10-0-134-166 kubenswrapper[2577]: I0420 
23:13:10.982723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-registration-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:10.982792 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982743 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-node-log\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.982792 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982761 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48hn9\" (UniqueName: \"kubernetes.io/projected/fcba1bc2-a90d-4680-b9a4-71239880dae3-kube-api-access-48hn9\") pod \"iptables-alerter-zh8tf\" (UID: \"fcba1bc2-a90d-4680-b9a4-71239880dae3\") " pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:10.982909 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982821 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-node-log\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.982909 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " 
pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:10.982909 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982887 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-etc-selinux\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:10.983062 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-run-ovn-kubernetes\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983062 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982926 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92tdb\" (UniqueName: \"kubernetes.io/projected/da93c8a9-809d-4b57-b6ad-138eab016391-kube-api-access-92tdb\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983062 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982955 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-device-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:10.983062 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982988 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/da93c8a9-809d-4b57-b6ad-138eab016391-ovnkube-config\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983062 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.982992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-run-ovn-kubernetes\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983062 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983005 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-socket-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:10.983062 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983045 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-slash\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983104 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-run-netns\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da93c8a9-809d-4b57-b6ad-138eab016391-ovn-node-metrics-cert\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983134 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-slash\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983161 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5e09233-03a8-4a45-aea7-fd8ccb794be7-tmp-dir\") pod \"node-resolver-zhjs2\" (UID: \"e5e09233-03a8-4a45-aea7-fd8ccb794be7\") " pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983181 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-run-netns\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983209 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-etc-openvswitch\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983246 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da93c8a9-809d-4b57-b6ad-138eab016391-ovnkube-script-lib\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/33327497-14af-4ec5-a658-d9a33e5963c1-konnectivity-ca\") pod \"konnectivity-agent-2mnql\" (UID: \"33327497-14af-4ec5-a658-d9a33e5963c1\") " pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983290 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-os-release\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " 
pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:10.983360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fcba1bc2-a90d-4680-b9a4-71239880dae3-iptables-alerter-script\") pod \"iptables-alerter-zh8tf\" (UID: \"fcba1bc2-a90d-4680-b9a4-71239880dae3\") " pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-etc-openvswitch\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/fcba1bc2-a90d-4680-b9a4-71239880dae3-host-slash\") pod \"iptables-alerter-zh8tf\" (UID: \"fcba1bc2-a90d-4680-b9a4-71239880dae3\") " pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983457 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-run-openvswitch\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983509 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da93c8a9-809d-4b57-b6ad-138eab016391-ovnkube-config\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983520 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz7kt\" (UniqueName: \"kubernetes.io/projected/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-kube-api-access-vz7kt\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 
23:13:10.983558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-sys-fs\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983559 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-run-openvswitch\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983588 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fcba1bc2-a90d-4680-b9a4-71239880dae3-host-slash\") pod \"iptables-alerter-zh8tf\" (UID: \"fcba1bc2-a90d-4680-b9a4-71239880dae3\") " pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-kubelet\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983653 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-kubelet\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 
kubenswrapper[2577]: I0420 23:13:10.983677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-run-ovn\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983697 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-cni-netd\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983698 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da93c8a9-809d-4b57-b6ad-138eab016391-ovnkube-script-lib\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.983848 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983725 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-run-ovn\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/33327497-14af-4ec5-a658-d9a33e5963c1-agent-certs\") pod \"konnectivity-agent-2mnql\" (UID: \"33327497-14af-4ec5-a658-d9a33e5963c1\") " pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 
23:13:10.983773 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-system-cni-dir\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fcba1bc2-a90d-4680-b9a4-71239880dae3-iptables-alerter-script\") pod \"iptables-alerter-zh8tf\" (UID: \"fcba1bc2-a90d-4680-b9a4-71239880dae3\") " pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983873 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-cni-netd\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983882 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-log-socket\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983905 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-log-socket\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 
23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da93c8a9-809d-4b57-b6ad-138eab016391-env-overrides\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983972 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-run-systemd\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.983991 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-cni-bin\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5e09233-03a8-4a45-aea7-fd8ccb794be7-hosts-file\") pod \"node-resolver-zhjs2\" (UID: \"e5e09233-03a8-4a45-aea7-fd8ccb794be7\") " 
pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984025 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-cnibin\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984051 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-host-cni-bin\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-systemd-units\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-run-systemd\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.984491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-var-lib-openvswitch\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.985012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-systemd-units\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.985012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984173 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwrvm\" (UniqueName: \"kubernetes.io/projected/e5e09233-03a8-4a45-aea7-fd8ccb794be7-kube-api-access-mwrvm\") pod \"node-resolver-zhjs2\" (UID: \"e5e09233-03a8-4a45-aea7-fd8ccb794be7\") " pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:10.985012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984190 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da93c8a9-809d-4b57-b6ad-138eab016391-var-lib-openvswitch\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.985012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984223 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s9h5\" (UniqueName: \"kubernetes.io/projected/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-kube-api-access-5s9h5\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:10.985012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984244 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97l6z\" (UniqueName: \"kubernetes.io/projected/25b85e22-f989-497b-a027-9ffb78b0533d-kube-api-access-97l6z\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:10.985012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.984274 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da93c8a9-809d-4b57-b6ad-138eab016391-env-overrides\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.985674 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.985643 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da93c8a9-809d-4b57-b6ad-138eab016391-ovn-node-metrics-cert\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:10.985962 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.985931 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/33327497-14af-4ec5-a658-d9a33e5963c1-agent-certs\") pod \"konnectivity-agent-2mnql\" (UID: \"33327497-14af-4ec5-a658-d9a33e5963c1\") " pod="kube-system/konnectivity-agent-2mnql" Apr 20 
23:13:10.996431 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:10.996398 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48hn9\" (UniqueName: \"kubernetes.io/projected/fcba1bc2-a90d-4680-b9a4-71239880dae3-kube-api-access-48hn9\") pod \"iptables-alerter-zh8tf\" (UID: \"fcba1bc2-a90d-4680-b9a4-71239880dae3\") " pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:11.002891 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.002870 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92tdb\" (UniqueName: \"kubernetes.io/projected/da93c8a9-809d-4b57-b6ad-138eab016391-kube-api-access-92tdb\") pod \"ovnkube-node-kbfps\" (UID: \"da93c8a9-809d-4b57-b6ad-138eab016391\") " pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:11.004719 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.004699 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-9hwxp"] Apr 20 23:13:11.008687 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.008672 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.011217 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.011177 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 23:13:11.011401 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.011388 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 23:13:11.011504 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.011486 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-cljzk\"" Apr 20 23:13:11.075759 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.075723 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2lxxb" Apr 20 23:13:11.084784 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.084759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.084887 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.084802 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-kubernetes\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.084887 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.084831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5e09233-03a8-4a45-aea7-fd8ccb794be7-tmp-dir\") pod \"node-resolver-zhjs2\" (UID: \"e5e09233-03a8-4a45-aea7-fd8ccb794be7\") " pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:11.084887 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.084857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-modprobe-d\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.085060 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.084896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-os-release\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.085060 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.084921 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.085060 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:11.084999 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96ea9fa_073d_43b9_86ea_ea051d78bec9.slice/crio-b22043ec1ef76fb4873edf8bec84470eb7b1368780d8a8f6400fe421b2ef984f WatchSource:0}: Error finding container b22043ec1ef76fb4873edf8bec84470eb7b1368780d8a8f6400fe421b2ef984f: Status 404 returned error can't find the container with id 
b22043ec1ef76fb4873edf8bec84470eb7b1368780d8a8f6400fe421b2ef984f Apr 20 23:13:11.085060 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-os-release\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.085060 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085042 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:11.085288 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-sysctl-d\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.085288 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085105 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-sysctl-conf\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.085288 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085142 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-systemd\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.085288 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vz7kt\" (UniqueName: \"kubernetes.io/projected/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-kube-api-access-vz7kt\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.085288 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:11.085190 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:11.085288 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-sys-fs\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.085288 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:11.085260 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs podName:25b85e22-f989-497b-a027-9ffb78b0533d nodeName:}" failed. No retries permitted until 2026-04-20 23:13:11.585244835 +0000 UTC m=+21.256613959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs") pod "network-metrics-daemon-7vff9" (UID: "25b85e22-f989-497b-a027-9ffb78b0533d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085332 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-sys-fs\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-system-cni-dir\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085369 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-sysconfig\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-system-cni-dir\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.085580 
ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085393 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-host\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5e09233-03a8-4a45-aea7-fd8ccb794be7-tmp-dir\") pod \"node-resolver-zhjs2\" (UID: \"e5e09233-03a8-4a45-aea7-fd8ccb794be7\") " pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085448 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmp6s\" (UniqueName: \"kubernetes.io/projected/c6979533-8e83-4666-8eef-f9eddb53e99e-kube-api-access-fmp6s\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085499 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5e09233-03a8-4a45-aea7-fd8ccb794be7-hosts-file\") pod \"node-resolver-zhjs2\" (UID: \"e5e09233-03a8-4a45-aea7-fd8ccb794be7\") " pod="openshift-dns/node-resolver-zhjs2" 
Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085511 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085522 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-sys\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-cnibin\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.085580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085575 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5e09233-03a8-4a45-aea7-fd8ccb794be7-hosts-file\") pod \"node-resolver-zhjs2\" (UID: 
\"e5e09233-03a8-4a45-aea7-fd8ccb794be7\") " pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwrvm\" (UniqueName: \"kubernetes.io/projected/e5e09233-03a8-4a45-aea7-fd8ccb794be7-kube-api-access-mwrvm\") pod \"node-resolver-zhjs2\" (UID: \"e5e09233-03a8-4a45-aea7-fd8ccb794be7\") " pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-cnibin\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-tuned\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085721 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-cni-binary-copy\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s9h5\" (UniqueName: 
\"kubernetes.io/projected/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-kube-api-access-5s9h5\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085843 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97l6z\" (UniqueName: \"kubernetes.io/projected/25b85e22-f989-497b-a027-9ffb78b0533d-kube-api-access-97l6z\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085883 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e70db85-c7e8-41b6-a59d-3b622a5e7bb0-host\") pod \"node-ca-5f4p9\" (UID: \"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0\") " pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085907 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3e70db85-c7e8-41b6-a59d-3b622a5e7bb0-serviceca\") pod \"node-ca-5f4p9\" (UID: \"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0\") " pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085957 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-registration-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.085983 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6979533-8e83-4666-8eef-f9eddb53e99e-tmp\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-registration-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-etc-selinux\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086121 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-device-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086144 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.086235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-run\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.086782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2xxl\" (UniqueName: \"kubernetes.io/projected/3e70db85-c7e8-41b6-a59d-3b622a5e7bb0-kube-api-access-c2xxl\") pod \"node-ca-5f4p9\" (UID: \"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0\") " pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.086782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086194 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-device-dir\") 
pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.086782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086206 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-socket-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.086782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086213 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-etc-selinux\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.086782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-lib-modules\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.086782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-var-lib-kubelet\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.086782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086312 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-socket-dir\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.086782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.086608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.100787 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.100759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwrvm\" (UniqueName: \"kubernetes.io/projected/e5e09233-03a8-4a45-aea7-fd8ccb794be7-kube-api-access-mwrvm\") pod \"node-resolver-zhjs2\" (UID: \"e5e09233-03a8-4a45-aea7-fd8ccb794be7\") " pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:11.101166 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.101147 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz7kt\" (UniqueName: \"kubernetes.io/projected/042233f9-bbe1-4bd7-acbd-b2180ef39cbb-kube-api-access-vz7kt\") pod \"multus-additional-cni-plugins-g2ng5\" (UID: \"042233f9-bbe1-4bd7-acbd-b2180ef39cbb\") " pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.101285 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.101187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s9h5\" (UniqueName: \"kubernetes.io/projected/d7d6b9db-ce8b-4cdf-9672-afbd58183b48-kube-api-access-5s9h5\") pod \"aws-ebs-csi-driver-node-4jscz\" (UID: \"d7d6b9db-ce8b-4cdf-9672-afbd58183b48\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 
23:13:11.102960 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.102918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97l6z\" (UniqueName: \"kubernetes.io/projected/25b85e22-f989-497b-a027-9ffb78b0533d-kube-api-access-97l6z\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:11.112324 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.112309 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zh8tf" Apr 20 23:13:11.118634 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:11.118609 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcba1bc2_a90d_4680_b9a4_71239880dae3.slice/crio-bca9d8e9a4ee1545368c8fdbf231b9b427d9fbe1a69da2f3bd4f55675ff0ee6f WatchSource:0}: Error finding container bca9d8e9a4ee1545368c8fdbf231b9b427d9fbe1a69da2f3bd4f55675ff0ee6f: Status 404 returned error can't find the container with id bca9d8e9a4ee1545368c8fdbf231b9b427d9fbe1a69da2f3bd4f55675ff0ee6f Apr 20 23:13:11.123578 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.123560 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:11.129029 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:11.129004 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda93c8a9_809d_4b57_b6ad_138eab016391.slice/crio-977263e5bc9a93f10933d936b4d273aa51dd0d0135f1545489f0b6d3b5612387 WatchSource:0}: Error finding container 977263e5bc9a93f10933d936b4d273aa51dd0d0135f1545489f0b6d3b5612387: Status 404 returned error can't find the container with id 977263e5bc9a93f10933d936b4d273aa51dd0d0135f1545489f0b6d3b5612387 Apr 20 23:13:11.139825 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.139804 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:11.145722 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:11.145701 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33327497_14af_4ec5_a658_d9a33e5963c1.slice/crio-93b9de7a0231ceeee3360657f0df19912a97b76ca1954ebabefa86b05883909a WatchSource:0}: Error finding container 93b9de7a0231ceeee3360657f0df19912a97b76ca1954ebabefa86b05883909a: Status 404 returned error can't find the container with id 93b9de7a0231ceeee3360657f0df19912a97b76ca1954ebabefa86b05883909a Apr 20 23:13:11.186664 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-run\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.186664 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2xxl\" (UniqueName: 
\"kubernetes.io/projected/3e70db85-c7e8-41b6-a59d-3b622a5e7bb0-kube-api-access-c2xxl\") pod \"node-ca-5f4p9\" (UID: \"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0\") " pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.186877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-lib-modules\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.186877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186760 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-var-lib-kubelet\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.186877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-kubernetes\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.186877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186825 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-lib-modules\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.186877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186831 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-modprobe-d\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.186877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-kubernetes\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-var-lib-kubelet\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-sysctl-d\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-run\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186908 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-sysctl-conf\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-systemd\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186972 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-modprobe-d\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186978 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-sysconfig\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.186990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-sysctl-d\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-host\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmp6s\" (UniqueName: \"kubernetes.io/projected/c6979533-8e83-4666-8eef-f9eddb53e99e-kube-api-access-fmp6s\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-sys\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-systemd\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187028 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-sysconfig\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-host\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-sysctl-conf\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6979533-8e83-4666-8eef-f9eddb53e99e-sys\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187103 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-tuned\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e70db85-c7e8-41b6-a59d-3b622a5e7bb0-host\") pod \"node-ca-5f4p9\" (UID: \"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0\") " pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.187740 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3e70db85-c7e8-41b6-a59d-3b622a5e7bb0-serviceca\") 
pod \"node-ca-5f4p9\" (UID: \"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0\") " pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.187740 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6979533-8e83-4666-8eef-f9eddb53e99e-tmp\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.187740 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.187211 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e70db85-c7e8-41b6-a59d-3b622a5e7bb0-host\") pod \"node-ca-5f4p9\" (UID: \"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0\") " pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.188679 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.188662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3e70db85-c7e8-41b6-a59d-3b622a5e7bb0-serviceca\") pod \"node-ca-5f4p9\" (UID: \"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0\") " pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.189321 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.189305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6979533-8e83-4666-8eef-f9eddb53e99e-tmp\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.189368 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.189338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c6979533-8e83-4666-8eef-f9eddb53e99e-etc-tuned\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 
20 23:13:11.198454 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.198425 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2xxl\" (UniqueName: \"kubernetes.io/projected/3e70db85-c7e8-41b6-a59d-3b622a5e7bb0-kube-api-access-c2xxl\") pod \"node-ca-5f4p9\" (UID: \"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0\") " pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.198616 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.198600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmp6s\" (UniqueName: \"kubernetes.io/projected/c6979533-8e83-4666-8eef-f9eddb53e99e-kube-api-access-fmp6s\") pod \"tuned-9hwxp\" (UID: \"c6979533-8e83-4666-8eef-f9eddb53e99e\") " pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.219493 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.219469 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zhjs2" Apr 20 23:13:11.224106 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.224091 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" Apr 20 23:13:11.225769 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:11.225748 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e09233_03a8_4a45_aea7_fd8ccb794be7.slice/crio-2f377bda001f86180064aca10c0be6c00f2dc76c7a849c93d7f6f44e355b2096 WatchSource:0}: Error finding container 2f377bda001f86180064aca10c0be6c00f2dc76c7a849c93d7f6f44e355b2096: Status 404 returned error can't find the container with id 2f377bda001f86180064aca10c0be6c00f2dc76c7a849c93d7f6f44e355b2096 Apr 20 23:13:11.231629 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:11.231606 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d6b9db_ce8b_4cdf_9672_afbd58183b48.slice/crio-edb47605f33f4d68604480065b71f2bc62f67b5812d05e5349928f5f596c709e WatchSource:0}: Error finding container edb47605f33f4d68604480065b71f2bc62f67b5812d05e5349928f5f596c709e: Status 404 returned error can't find the container with id edb47605f33f4d68604480065b71f2bc62f67b5812d05e5349928f5f596c709e Apr 20 23:13:11.252805 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.252773 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" Apr 20 23:13:11.259194 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:11.259169 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod042233f9_bbe1_4bd7_acbd_b2180ef39cbb.slice/crio-f49a6d8cecad419f713513a00450c37feaf02da4946a0a3f4b45b1d4d24d91a6 WatchSource:0}: Error finding container f49a6d8cecad419f713513a00450c37feaf02da4946a0a3f4b45b1d4d24d91a6: Status 404 returned error can't find the container with id f49a6d8cecad419f713513a00450c37feaf02da4946a0a3f4b45b1d4d24d91a6 Apr 20 23:13:11.280029 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.280000 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5f4p9" Apr 20 23:13:11.286780 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:11.286754 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e70db85_c7e8_41b6_a59d_3b622a5e7bb0.slice/crio-3ccc7d0357eeb70e6b671c858de2654cf531c3706b401a8b84217b273e994896 WatchSource:0}: Error finding container 3ccc7d0357eeb70e6b671c858de2654cf531c3706b401a8b84217b273e994896: Status 404 returned error can't find the container with id 3ccc7d0357eeb70e6b671c858de2654cf531c3706b401a8b84217b273e994896 Apr 20 23:13:11.316975 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.316937 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" Apr 20 23:13:11.323811 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:11.323774 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6979533_8e83_4666_8eef_f9eddb53e99e.slice/crio-60595c1b153e4666cbea8bc0d109fafe73eafae676cad89e71061ec934d208a1 WatchSource:0}: Error finding container 60595c1b153e4666cbea8bc0d109fafe73eafae676cad89e71061ec934d208a1: Status 404 returned error can't find the container with id 60595c1b153e4666cbea8bc0d109fafe73eafae676cad89e71061ec934d208a1 Apr 20 23:13:11.488361 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.488315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:11.488554 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:11.488493 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:13:11.488554 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:11.488509 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:13:11.488554 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:11.488518 2577 projected.go:194] Error preparing data for projected volume kube-api-access-48wx7 for pod openshift-network-diagnostics/network-check-target-cvv7k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 
23:13:11.488715 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:11.488577 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7 podName:dfd7de81-99ac-43ab-b2b7-773e89f49916 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:12.488562737 +0000 UTC m=+22.159931863 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-48wx7" (UniqueName: "kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7") pod "network-check-target-cvv7k" (UID: "dfd7de81-99ac-43ab-b2b7-773e89f49916") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:11.589882 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.589164 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:11.589882 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:11.589382 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:11.589882 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:11.589453 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs podName:25b85e22-f989-497b-a027-9ffb78b0533d nodeName:}" failed. No retries permitted until 2026-04-20 23:13:12.589432454 +0000 UTC m=+22.260801593 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs") pod "network-metrics-daemon-7vff9" (UID: "25b85e22-f989-497b-a027-9ffb78b0533d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:11.841477 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.841379 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 23:08:10 +0000 UTC" deadline="2027-11-24 03:38:27.740978285 +0000 UTC" Apr 20 23:13:11.841477 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.841423 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13972h25m15.899559013s" Apr 20 23:13:11.986090 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:11.986048 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" event={"ID":"042233f9-bbe1-4bd7-acbd-b2180ef39cbb","Type":"ContainerStarted","Data":"f49a6d8cecad419f713513a00450c37feaf02da4946a0a3f4b45b1d4d24d91a6"} Apr 20 23:13:12.010399 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.009417 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zh8tf" event={"ID":"fcba1bc2-a90d-4680-b9a4-71239880dae3","Type":"ContainerStarted","Data":"bca9d8e9a4ee1545368c8fdbf231b9b427d9fbe1a69da2f3bd4f55675ff0ee6f"} Apr 20 23:13:12.014034 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.013969 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2lxxb" event={"ID":"b96ea9fa-073d-43b9-86ea-ea051d78bec9","Type":"ContainerStarted","Data":"b22043ec1ef76fb4873edf8bec84470eb7b1368780d8a8f6400fe421b2ef984f"} Apr 20 23:13:12.025968 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.025922 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" event={"ID":"d7d6b9db-ce8b-4cdf-9672-afbd58183b48","Type":"ContainerStarted","Data":"edb47605f33f4d68604480065b71f2bc62f67b5812d05e5349928f5f596c709e"} Apr 20 23:13:12.040446 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.040407 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zhjs2" event={"ID":"e5e09233-03a8-4a45-aea7-fd8ccb794be7","Type":"ContainerStarted","Data":"2f377bda001f86180064aca10c0be6c00f2dc76c7a849c93d7f6f44e355b2096"} Apr 20 23:13:12.047200 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.047133 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2mnql" event={"ID":"33327497-14af-4ec5-a658-d9a33e5963c1","Type":"ContainerStarted","Data":"93b9de7a0231ceeee3360657f0df19912a97b76ca1954ebabefa86b05883909a"} Apr 20 23:13:12.064601 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.064537 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" event={"ID":"da93c8a9-809d-4b57-b6ad-138eab016391","Type":"ContainerStarted","Data":"977263e5bc9a93f10933d936b4d273aa51dd0d0135f1545489f0b6d3b5612387"} Apr 20 23:13:12.072921 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.072884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" event={"ID":"c6979533-8e83-4666-8eef-f9eddb53e99e","Type":"ContainerStarted","Data":"60595c1b153e4666cbea8bc0d109fafe73eafae676cad89e71061ec934d208a1"} Apr 20 23:13:12.082270 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.082235 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5f4p9" event={"ID":"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0","Type":"ContainerStarted","Data":"3ccc7d0357eeb70e6b671c858de2654cf531c3706b401a8b84217b273e994896"} Apr 20 23:13:12.497405 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.496735 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:12.497405 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:12.496920 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:13:12.497405 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:12.496939 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:13:12.497405 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:12.496969 2577 projected.go:194] Error preparing data for projected volume kube-api-access-48wx7 for pod openshift-network-diagnostics/network-check-target-cvv7k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:12.497405 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:12.497033 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7 podName:dfd7de81-99ac-43ab-b2b7-773e89f49916 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:14.497012289 +0000 UTC m=+24.168381415 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-48wx7" (UniqueName: "kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7") pod "network-check-target-cvv7k" (UID: "dfd7de81-99ac-43ab-b2b7-773e89f49916") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:12.541458 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.541105 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 23:13:12.598561 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.597869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:12.598561 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:12.598118 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:12.598561 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:12.598189 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs podName:25b85e22-f989-497b-a027-9ffb78b0533d nodeName:}" failed. No retries permitted until 2026-04-20 23:13:14.5981675 +0000 UTC m=+24.269536647 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs") pod "network-metrics-daemon-7vff9" (UID: "25b85e22-f989-497b-a027-9ffb78b0533d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:12.842331 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.842233 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 23:08:10 +0000 UTC" deadline="2028-01-28 10:38:53.900189214 +0000 UTC" Apr 20 23:13:12.842331 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.842274 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15539h25m41.057919035s" Apr 20 23:13:12.915144 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.915110 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:12.915366 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:12.915251 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:12.915989 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:12.915968 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:12.916151 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:12.916126 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:14.512914 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:14.512858 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:14.513604 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:14.513583 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:13:14.513693 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:14.513612 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:13:14.513693 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:14.513626 2577 projected.go:194] Error preparing data for projected volume kube-api-access-48wx7 for pod openshift-network-diagnostics/network-check-target-cvv7k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:14.513796 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:14.513697 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7 podName:dfd7de81-99ac-43ab-b2b7-773e89f49916 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:18.513675807 +0000 UTC m=+28.185044933 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-48wx7" (UniqueName: "kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7") pod "network-check-target-cvv7k" (UID: "dfd7de81-99ac-43ab-b2b7-773e89f49916") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:14.613519 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:14.613477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:14.613711 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:14.613659 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:14.613769 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:14.613726 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs podName:25b85e22-f989-497b-a027-9ffb78b0533d nodeName:}" failed. No retries permitted until 2026-04-20 23:13:18.613708488 +0000 UTC m=+28.285077618 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs") pod "network-metrics-daemon-7vff9" (UID: "25b85e22-f989-497b-a027-9ffb78b0533d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:14.914649 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:14.914504 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:14.914649 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:14.914606 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:14.915074 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:14.915053 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:14.915185 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:14.915168 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:16.913991 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:16.913961 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:16.914456 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:16.914125 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:16.914456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:16.914179 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:16.914456 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:16.914306 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:18.550315 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:18.550272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:18.550815 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:18.550429 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:13:18.550815 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:18.550448 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:13:18.550815 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:18.550463 2577 projected.go:194] Error preparing data for projected volume kube-api-access-48wx7 for pod openshift-network-diagnostics/network-check-target-cvv7k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:18.550815 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:18.550528 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7 podName:dfd7de81-99ac-43ab-b2b7-773e89f49916 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:26.550511152 +0000 UTC m=+36.221880281 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-48wx7" (UniqueName: "kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7") pod "network-check-target-cvv7k" (UID: "dfd7de81-99ac-43ab-b2b7-773e89f49916") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:18.651695 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:18.651654 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:18.651897 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:18.651805 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:18.651897 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:18.651871 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs podName:25b85e22-f989-497b-a027-9ffb78b0533d nodeName:}" failed. No retries permitted until 2026-04-20 23:13:26.651851865 +0000 UTC m=+36.323220998 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs") pod "network-metrics-daemon-7vff9" (UID: "25b85e22-f989-497b-a027-9ffb78b0533d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:18.914789 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:18.914273 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:18.914789 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:18.914426 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:18.915054 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:18.914795 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:18.915054 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:18.914906 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:20.914798 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:20.914758 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:20.915275 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:20.914882 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:20.915275 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:20.914934 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:20.915275 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:20.915077 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:22.914729 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:22.914691 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:22.915282 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:22.914707 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:22.915282 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:22.914824 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:22.915282 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:22.914955 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:23.914659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:23.914633 2577 scope.go:117] "RemoveContainer" containerID="e09f74f7868f52dc9fafb978926c7d4883e2009c4e8f29d2520b661d8579a4e5" Apr 20 23:13:23.914869 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:23.914835 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_openshift-machine-config-operator(ff387f4dda566eb1da7627d669fc453f)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" podUID="ff387f4dda566eb1da7627d669fc453f" Apr 20 23:13:24.913988 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:24.913933 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:24.914182 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:24.913994 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:24.914182 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:24.914092 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:24.914301 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:24.914226 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:26.122637 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:26.122599 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 23:13:26.615247 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:26.615211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:26.615430 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:26.615397 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:13:26.615490 ip-10-0-134-166 kubenswrapper[2577]: E0420 
23:13:26.615434 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:13:26.615490 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:26.615449 2577 projected.go:194] Error preparing data for projected volume kube-api-access-48wx7 for pod openshift-network-diagnostics/network-check-target-cvv7k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:26.615573 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:26.615517 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7 podName:dfd7de81-99ac-43ab-b2b7-773e89f49916 nodeName:}" failed. No retries permitted until 2026-04-20 23:13:42.615495437 +0000 UTC m=+52.286864568 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-48wx7" (UniqueName: "kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7") pod "network-check-target-cvv7k" (UID: "dfd7de81-99ac-43ab-b2b7-773e89f49916") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:26.716368 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:26.716319 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:26.716558 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:26.716471 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:26.716558 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:26.716550 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs podName:25b85e22-f989-497b-a027-9ffb78b0533d nodeName:}" failed. No retries permitted until 2026-04-20 23:13:42.716528561 +0000 UTC m=+52.387897687 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs") pod "network-metrics-daemon-7vff9" (UID: "25b85e22-f989-497b-a027-9ffb78b0533d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:26.914456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:26.914365 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:26.914456 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:26.914422 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:26.914658 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:26.914510 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:26.914658 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:26.914630 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:28.914516 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:28.914111 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:28.915316 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:28.914138 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:28.915316 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:28.914609 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:28.915316 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:28.915077 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:29.119879 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.119790 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" event={"ID":"d7d6b9db-ce8b-4cdf-9672-afbd58183b48","Type":"ContainerStarted","Data":"b8b5922eaa50588ae22652fa655ca478f9a5d980829fad9d0dc94f5f9f195cde"} Apr 20 23:13:29.121256 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.121220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zhjs2" event={"ID":"e5e09233-03a8-4a45-aea7-fd8ccb794be7","Type":"ContainerStarted","Data":"91067dc7f099ab3359933b9f76fb5f360538d7e7c6fc2fa34317fb4169ab8106"} Apr 20 23:13:29.122910 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.122880 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2mnql" event={"ID":"33327497-14af-4ec5-a658-d9a33e5963c1","Type":"ContainerStarted","Data":"fcb1dae8629523a5737b6a54e6ec249a7c4e7531e2d7216229d88e22ef79b970"} Apr 20 23:13:29.125844 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.125808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" event={"ID":"da93c8a9-809d-4b57-b6ad-138eab016391","Type":"ContainerStarted","Data":"bfdd1adb7c651edef7af79e4a8aa62bbf1771bfb6a59a22adfb32618814a4b5a"} Apr 20 23:13:29.125844 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.125839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" event={"ID":"da93c8a9-809d-4b57-b6ad-138eab016391","Type":"ContainerStarted","Data":"0d6f4db7ac0176dd77c7ff1609f5568503334ead35b133432849bd09e71f4a02"} Apr 20 23:13:29.126035 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.125854 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" 
event={"ID":"da93c8a9-809d-4b57-b6ad-138eab016391","Type":"ContainerStarted","Data":"5f2db9c1cf530f4ddd74858ba075d2604ecc9834d8b6229aefba287e74586eed"} Apr 20 23:13:29.126035 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.125868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" event={"ID":"da93c8a9-809d-4b57-b6ad-138eab016391","Type":"ContainerStarted","Data":"d3cb2b666a646af7f5ebc7eed2784667c3d93e9cdcbf16fdea72fb73a456ad84"} Apr 20 23:13:29.126035 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.125880 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" event={"ID":"da93c8a9-809d-4b57-b6ad-138eab016391","Type":"ContainerStarted","Data":"bf0d7910b46be48ef6dfe89e879090e92b528a062259e03497a7b343d6527db4"} Apr 20 23:13:29.126035 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.125891 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" event={"ID":"da93c8a9-809d-4b57-b6ad-138eab016391","Type":"ContainerStarted","Data":"4a2a9c7a94207778cac36115abd5734c536665f681285f2137ae66ca8f8fec95"} Apr 20 23:13:29.127238 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.127208 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" event={"ID":"c6979533-8e83-4666-8eef-f9eddb53e99e","Type":"ContainerStarted","Data":"aa1b7f106fe041fe3ee33e8758aa86e0f81cd93567a8e01532503a01791999eb"} Apr 20 23:13:29.128829 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.128800 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5f4p9" event={"ID":"3e70db85-c7e8-41b6-a59d-3b622a5e7bb0","Type":"ContainerStarted","Data":"26b6f95748341a39570ca7b8580f7564f233f2459bd87e87b6d0c4216930dbf7"} Apr 20 23:13:29.130263 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.130242 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="042233f9-bbe1-4bd7-acbd-b2180ef39cbb" containerID="8d45c475b5ef6be764f5eba73af182a3212d52c4876ccce36aa3504e8ccb4fbd" exitCode=0 Apr 20 23:13:29.130366 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.130305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" event={"ID":"042233f9-bbe1-4bd7-acbd-b2180ef39cbb","Type":"ContainerDied","Data":"8d45c475b5ef6be764f5eba73af182a3212d52c4876ccce36aa3504e8ccb4fbd"} Apr 20 23:13:29.132003 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.131657 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2lxxb" event={"ID":"b96ea9fa-073d-43b9-86ea-ea051d78bec9","Type":"ContainerStarted","Data":"7690086d737e17da01c24f41b8aeef32b505b2643457c6815a6eac3e334de093"} Apr 20 23:13:29.141231 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.141178 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zhjs2" podStartSLOduration=2.274176981 podStartE2EDuration="19.141160995s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:13:11.227858614 +0000 UTC m=+20.899227740" lastFinishedPulling="2026-04-20 23:13:28.094842614 +0000 UTC m=+37.766211754" observedRunningTime="2026-04-20 23:13:29.140134843 +0000 UTC m=+38.811503992" watchObservedRunningTime="2026-04-20 23:13:29.141160995 +0000 UTC m=+38.812530145" Apr 20 23:13:29.177774 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.177719 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9hwxp" podStartSLOduration=2.408066991 podStartE2EDuration="19.177703995s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:13:11.325248112 +0000 UTC m=+20.996617236" lastFinishedPulling="2026-04-20 23:13:28.094885104 +0000 UTC m=+37.766254240" observedRunningTime="2026-04-20 23:13:29.177429699 +0000 UTC 
m=+38.848798847" watchObservedRunningTime="2026-04-20 23:13:29.177703995 +0000 UTC m=+38.849073141" Apr 20 23:13:29.192775 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.192715 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2mnql" podStartSLOduration=2.2667948620000002 podStartE2EDuration="19.192690844s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:13:11.147081488 +0000 UTC m=+20.818450614" lastFinishedPulling="2026-04-20 23:13:28.072977468 +0000 UTC m=+37.744346596" observedRunningTime="2026-04-20 23:13:29.192571921 +0000 UTC m=+38.863941068" watchObservedRunningTime="2026-04-20 23:13:29.192690844 +0000 UTC m=+38.864060005" Apr 20 23:13:29.210131 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.210082 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5f4p9" podStartSLOduration=7.033499434 podStartE2EDuration="19.210066843s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:13:11.288271515 +0000 UTC m=+20.959640640" lastFinishedPulling="2026-04-20 23:13:23.464838921 +0000 UTC m=+33.136208049" observedRunningTime="2026-04-20 23:13:29.209903038 +0000 UTC m=+38.881272187" watchObservedRunningTime="2026-04-20 23:13:29.210066843 +0000 UTC m=+38.881435990" Apr 20 23:13:29.233440 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.233376 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2lxxb" podStartSLOduration=2.189033963 podStartE2EDuration="19.233360911s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:13:11.086877662 +0000 UTC m=+20.758246788" lastFinishedPulling="2026-04-20 23:13:28.131204594 +0000 UTC m=+37.802573736" observedRunningTime="2026-04-20 23:13:29.23328036 +0000 UTC m=+38.904649508" watchObservedRunningTime="2026-04-20 23:13:29.233360911 +0000 UTC 
m=+38.904730057" Apr 20 23:13:29.396435 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.396400 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 23:13:29.856451 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.856338 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T23:13:29.396430115Z","UUID":"7fd0ddf5-5860-431c-b8d4-946f7561adcb","Handler":null,"Name":"","Endpoint":""} Apr 20 23:13:29.860056 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.860026 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 23:13:29.860056 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:29.860060 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 23:13:30.083762 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.083727 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tq6lg"] Apr 20 23:13:30.087304 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.087252 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.089899 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.089753 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 23:13:30.089899 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.089782 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 23:13:30.089899 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.089813 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 23:13:30.091019 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.090354 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 23:13:30.091019 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.090621 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 23:13:30.091019 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.090856 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kdhph\"" Apr 20 23:13:30.091019 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.090973 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 23:13:30.136150 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.136048 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zh8tf" event={"ID":"fcba1bc2-a90d-4680-b9a4-71239880dae3","Type":"ContainerStarted","Data":"967ce24882840ce5e5946c641d32ac35346360d4f96678164becf6cb6fa2092e"} Apr 20 23:13:30.140700 ip-10-0-134-166 
kubenswrapper[2577]: I0420 23:13:30.140664 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" event={"ID":"d7d6b9db-ce8b-4cdf-9672-afbd58183b48","Type":"ContainerStarted","Data":"8626a69c17c6a2bcfc5f87bf2d4ec8bc31a6dd521220cdc6188a578f0d3eb081"} Apr 20 23:13:30.142595 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.142571 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-sys\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.142727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.142612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-metrics-client-ca\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.142727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.142639 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-wtmp\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.142727 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.142715 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnrks\" (UniqueName: \"kubernetes.io/projected/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-kube-api-access-wnrks\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " 
pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.142924 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.142739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.142924 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.142757 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-accelerators-collector-config\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.142924 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.142782 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-textfile\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.142924 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.142858 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-root\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.142924 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.142899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-tls\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.154310 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.154260 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zh8tf" podStartSLOduration=3.179566404 podStartE2EDuration="20.154245725s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:13:11.120121797 +0000 UTC m=+20.791490922" lastFinishedPulling="2026-04-20 23:13:28.094801103 +0000 UTC m=+37.766170243" observedRunningTime="2026-04-20 23:13:30.153986507 +0000 UTC m=+39.825355653" watchObservedRunningTime="2026-04-20 23:13:30.154245725 +0000 UTC m=+39.825614871" Apr 20 23:13:30.243873 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.243737 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-root\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.243873 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.243789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-tls\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.243873 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.243849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-sys\") pod \"node-exporter-tq6lg\" (UID: 
\"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.244172 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.243872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-root\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.244172 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.243901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-metrics-client-ca\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.244172 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.243934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-wtmp\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.244172 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.244052 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnrks\" (UniqueName: \"kubernetes.io/projected/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-kube-api-access-wnrks\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.244172 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.244113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.244172 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.244163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-accelerators-collector-config\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.244461 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.244197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-textfile\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.244461 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.244377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-sys\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.244577 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.244516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-metrics-client-ca\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.245144 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.244661 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-wtmp\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.245144 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.244961 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-textfile\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.245309 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.245267 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-accelerators-collector-config\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.250080 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.250024 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.250665 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.250621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-node-exporter-tls\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.252492 
ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.252467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnrks\" (UniqueName: \"kubernetes.io/projected/9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe-kube-api-access-wnrks\") pod \"node-exporter-tq6lg\" (UID: \"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe\") " pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.399887 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.399802 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tq6lg" Apr 20 23:13:30.408890 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:30.408637 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d3f5282_1b79_4bbc_88f1_23ec4a8f6fbe.slice/crio-73dfa4aca53dcb44634379d6c76f37244466ee9ac59c26eeb39b6789182bf777 WatchSource:0}: Error finding container 73dfa4aca53dcb44634379d6c76f37244466ee9ac59c26eeb39b6789182bf777: Status 404 returned error can't find the container with id 73dfa4aca53dcb44634379d6c76f37244466ee9ac59c26eeb39b6789182bf777 Apr 20 23:13:30.915620 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.915415 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:30.915620 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:30.915535 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:30.915867 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:30.915660 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:30.915867 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:30.915770 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:31.140859 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:31.140824 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:31.141477 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:31.141424 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:31.143822 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:31.143786 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tq6lg" event={"ID":"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe","Type":"ContainerStarted","Data":"73dfa4aca53dcb44634379d6c76f37244466ee9ac59c26eeb39b6789182bf777"} Apr 20 23:13:31.146341 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:31.146287 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" event={"ID":"d7d6b9db-ce8b-4cdf-9672-afbd58183b48","Type":"ContainerStarted","Data":"9eae2bccd3c083c94dbba2e02f0e73f01b133d59ae6c32e7beb9d33a00825b3b"} Apr 20 23:13:31.151011 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:31.150979 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" event={"ID":"da93c8a9-809d-4b57-b6ad-138eab016391","Type":"ContainerStarted","Data":"589226953d5a67f94429395a217c2a3071cedd694e35e0fc7c0dcd33d13c8924"} Apr 20 
23:13:31.151616 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:31.151547 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:31.152237 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:31.152217 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2mnql" Apr 20 23:13:31.203749 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:31.203683 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4jscz" podStartSLOduration=2.254350532 podStartE2EDuration="21.203662049s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:13:11.233036868 +0000 UTC m=+20.904405992" lastFinishedPulling="2026-04-20 23:13:30.182348367 +0000 UTC m=+39.853717509" observedRunningTime="2026-04-20 23:13:31.203030162 +0000 UTC m=+40.874399310" watchObservedRunningTime="2026-04-20 23:13:31.203662049 +0000 UTC m=+40.875031195" Apr 20 23:13:31.478168 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:31.478094 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 23:13:32.154963 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:32.154830 2577 generic.go:358] "Generic (PLEG): container finished" podID="9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe" containerID="89839c213d8cb1adf5dbac9aa66c6f791716d1239b11edb87418cedfaf56d9e9" exitCode=0 Apr 20 23:13:32.154963 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:32.154921 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tq6lg" event={"ID":"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe","Type":"ContainerDied","Data":"89839c213d8cb1adf5dbac9aa66c6f791716d1239b11edb87418cedfaf56d9e9"} Apr 20 23:13:32.914210 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:32.914169 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:32.914380 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:32.914312 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:32.914444 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:32.914389 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:32.914509 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:32.914491 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:34.164579 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.164281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" event={"ID":"da93c8a9-809d-4b57-b6ad-138eab016391","Type":"ContainerStarted","Data":"ef7f0944bb722c3d53f5927963681616117c187ea3c62992603cf36f12641897"} Apr 20 23:13:34.165291 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.164590 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:34.165904 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.165882 2577 generic.go:358] "Generic (PLEG): container finished" podID="042233f9-bbe1-4bd7-acbd-b2180ef39cbb" containerID="c7e3c5bf437e4e026155d496436e10fe52d9d751c2d15bad6bd953373c9ca829" exitCode=0 Apr 20 23:13:34.166012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.165965 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" event={"ID":"042233f9-bbe1-4bd7-acbd-b2180ef39cbb","Type":"ContainerDied","Data":"c7e3c5bf437e4e026155d496436e10fe52d9d751c2d15bad6bd953373c9ca829"} Apr 20 23:13:34.167773 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.167751 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tq6lg" event={"ID":"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe","Type":"ContainerStarted","Data":"1e865ca07b1e5d17b7e25b84c6c4a6dd083d8f19ef8853d45a588af7e5928718"} Apr 20 23:13:34.167872 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.167778 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tq6lg" event={"ID":"9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe","Type":"ContainerStarted","Data":"6ecc679aa344b0411ce67d40d87af61712186de86442473bdce36a869dbc89af"} Apr 20 23:13:34.186091 ip-10-0-134-166 kubenswrapper[2577]: 
I0420 23:13:34.186062 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:34.191341 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.191302 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" podStartSLOduration=7.143281646 podStartE2EDuration="24.191291039s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:13:11.130503607 +0000 UTC m=+20.801872733" lastFinishedPulling="2026-04-20 23:13:28.178512987 +0000 UTC m=+37.849882126" observedRunningTime="2026-04-20 23:13:34.191070808 +0000 UTC m=+43.862439956" watchObservedRunningTime="2026-04-20 23:13:34.191291039 +0000 UTC m=+43.862660185" Apr 20 23:13:34.222850 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.222819 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:34.230190 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.230147 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tq6lg" podStartSLOduration=3.421744022 podStartE2EDuration="4.230132547s" podCreationTimestamp="2026-04-20 23:13:30 +0000 UTC" firstStartedPulling="2026-04-20 23:13:30.410920641 +0000 UTC m=+40.082289765" lastFinishedPulling="2026-04-20 23:13:31.219309153 +0000 UTC m=+40.890678290" observedRunningTime="2026-04-20 23:13:34.22958449 +0000 UTC m=+43.900953637" watchObservedRunningTime="2026-04-20 23:13:34.230132547 +0000 UTC m=+43.901501698" Apr 20 23:13:34.914332 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.914296 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:34.914545 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.914528 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:34.914641 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:34.914623 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:34.914758 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:34.914737 2577 scope.go:117] "RemoveContainer" containerID="e09f74f7868f52dc9fafb978926c7d4883e2009c4e8f29d2520b661d8579a4e5" Apr 20 23:13:34.914815 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:34.914756 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:35.171287 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:35.171228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/2.log" Apr 20 23:13:35.171832 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:35.171609 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" event={"ID":"ff387f4dda566eb1da7627d669fc453f","Type":"ContainerStarted","Data":"b8c1fb3debe5f25af1a0f4b2d4b4498e74c2df64cb2a973298b5a0047738c63c"} Apr 20 23:13:35.172450 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:35.172431 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:35.187560 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:35.187364 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:13:35.187689 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:35.187453 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal" podStartSLOduration=25.187439568 podStartE2EDuration="25.187439568s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:13:35.186963681 +0000 UTC m=+44.858332830" watchObservedRunningTime="2026-04-20 23:13:35.187439568 +0000 UTC m=+44.858808714" Apr 20 23:13:35.382811 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:35.382777 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/network-metrics-daemon-7vff9"] Apr 20 23:13:35.382988 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:35.382929 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:35.383102 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:35.383072 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:35.395216 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:35.395193 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cvv7k"] Apr 20 23:13:35.395333 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:35.395309 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:35.395418 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:35.395389 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:36.175416 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:36.175375 2577 generic.go:358] "Generic (PLEG): container finished" podID="042233f9-bbe1-4bd7-acbd-b2180ef39cbb" containerID="2c85bf81676fa717333a971d9b68757f6a152b43d83c89d9d8b1cd90b4153194" exitCode=0 Apr 20 23:13:36.175883 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:36.175459 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" event={"ID":"042233f9-bbe1-4bd7-acbd-b2180ef39cbb","Type":"ContainerDied","Data":"2c85bf81676fa717333a971d9b68757f6a152b43d83c89d9d8b1cd90b4153194"} Apr 20 23:13:36.914587 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:36.914550 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:36.914587 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:36.914587 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:36.914778 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:36.914698 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:36.914819 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:36.914801 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:38.181824 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:38.181788 2577 generic.go:358] "Generic (PLEG): container finished" podID="042233f9-bbe1-4bd7-acbd-b2180ef39cbb" containerID="5d0a2c80f697045666d7cda5a3dbd3f53a6b6093d98ca27e0a3d6449e7117a6f" exitCode=0 Apr 20 23:13:38.182386 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:38.181840 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" event={"ID":"042233f9-bbe1-4bd7-acbd-b2180ef39cbb","Type":"ContainerDied","Data":"5d0a2c80f697045666d7cda5a3dbd3f53a6b6093d98ca27e0a3d6449e7117a6f"} Apr 20 23:13:38.914731 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:38.914690 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:38.914907 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:38.914830 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:38.914907 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:38.914882 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:38.915043 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:38.915022 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:40.916229 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:40.915973 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:40.916675 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:40.916003 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:40.919535 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:40.917152 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7vff9" podUID="25b85e22-f989-497b-a027-9ffb78b0533d" Apr 20 23:13:40.919535 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:40.917327 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cvv7k" podUID="dfd7de81-99ac-43ab-b2b7-773e89f49916" Apr 20 23:13:41.189580 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.189547 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-166.ec2.internal" event="NodeReady" Apr 20 23:13:41.189767 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.189719 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 23:13:41.235887 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.235854 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mkjgb"] Apr 20 23:13:41.263607 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.263577 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ll8sp"] Apr 20 23:13:41.263784 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.263738 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mkjgb" Apr 20 23:13:41.266254 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.266113 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-4dh2j\"" Apr 20 23:13:41.266254 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.266133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 23:13:41.266448 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.266280 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 23:13:41.266510 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.266481 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 23:13:41.278828 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.278761 
2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mkjgb"] Apr 20 23:13:41.278959 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.278855 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ll8sp"] Apr 20 23:13:41.278959 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.278888 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.281223 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.281174 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 23:13:41.281346 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.281227 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 23:13:41.281346 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.281254 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2fmt6\"" Apr 20 23:13:41.339228 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.339188 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c62gd"] Apr 20 23:13:41.357477 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.357446 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c62gd"] Apr 20 23:13:41.357626 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.357531 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.360157 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.360130 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-clbx8\"" Apr 20 23:13:41.360300 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.360181 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 23:13:41.360300 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.360130 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 23:13:41.360300 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.360131 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 23:13:41.360300 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.360239 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 23:13:41.433333 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433298 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scbgw\" (UniqueName: \"kubernetes.io/projected/93d27a11-204e-4737-b1e5-95f56ec3a768-kube-api-access-scbgw\") pod \"ingress-canary-mkjgb\" (UID: \"93d27a11-204e-4737-b1e5-95f56ec3a768\") " pod="openshift-ingress-canary/ingress-canary-mkjgb" Apr 20 23:13:41.433507 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433357 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75tbf\" (UniqueName: \"kubernetes.io/projected/b48ecb2f-b835-4a84-b0ca-b01696b7e237-kube-api-access-75tbf\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " 
pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.433507 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b48ecb2f-b835-4a84-b0ca-b01696b7e237-metrics-tls\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.433507 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7880a2ec-8385-4990-a775-f9d711d92cc3-crio-socket\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.433507 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7880a2ec-8385-4990-a775-f9d711d92cc3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.433507 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433493 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7880a2ec-8385-4990-a775-f9d711d92cc3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.433737 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/7880a2ec-8385-4990-a775-f9d711d92cc3-data-volume\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.433737 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2q6t\" (UniqueName: \"kubernetes.io/projected/7880a2ec-8385-4990-a775-f9d711d92cc3-kube-api-access-s2q6t\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.433737 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b48ecb2f-b835-4a84-b0ca-b01696b7e237-config-volume\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.433737 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93d27a11-204e-4737-b1e5-95f56ec3a768-cert\") pod \"ingress-canary-mkjgb\" (UID: \"93d27a11-204e-4737-b1e5-95f56ec3a768\") " pod="openshift-ingress-canary/ingress-canary-mkjgb" Apr 20 23:13:41.433737 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.433665 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b48ecb2f-b835-4a84-b0ca-b01696b7e237-tmp-dir\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.534419 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534319 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scbgw\" (UniqueName: \"kubernetes.io/projected/93d27a11-204e-4737-b1e5-95f56ec3a768-kube-api-access-scbgw\") pod \"ingress-canary-mkjgb\" (UID: \"93d27a11-204e-4737-b1e5-95f56ec3a768\") " pod="openshift-ingress-canary/ingress-canary-mkjgb" Apr 20 23:13:41.534419 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75tbf\" (UniqueName: \"kubernetes.io/projected/b48ecb2f-b835-4a84-b0ca-b01696b7e237-kube-api-access-75tbf\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.534665 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b48ecb2f-b835-4a84-b0ca-b01696b7e237-metrics-tls\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.534665 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7880a2ec-8385-4990-a775-f9d711d92cc3-crio-socket\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.534665 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7880a2ec-8385-4990-a775-f9d711d92cc3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 
23:13:41.534665 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7880a2ec-8385-4990-a775-f9d711d92cc3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.534665 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7880a2ec-8385-4990-a775-f9d711d92cc3-data-volume\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.534665 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534595 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2q6t\" (UniqueName: \"kubernetes.io/projected/7880a2ec-8385-4990-a775-f9d711d92cc3-kube-api-access-s2q6t\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.534665 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b48ecb2f-b835-4a84-b0ca-b01696b7e237-config-volume\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.534665 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534656 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93d27a11-204e-4737-b1e5-95f56ec3a768-cert\") pod \"ingress-canary-mkjgb\" (UID: 
\"93d27a11-204e-4737-b1e5-95f56ec3a768\") " pod="openshift-ingress-canary/ingress-canary-mkjgb" Apr 20 23:13:41.535066 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b48ecb2f-b835-4a84-b0ca-b01696b7e237-tmp-dir\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.535066 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.534717 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7880a2ec-8385-4990-a775-f9d711d92cc3-crio-socket\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.535066 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.535054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b48ecb2f-b835-4a84-b0ca-b01696b7e237-tmp-dir\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.535195 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.535139 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7880a2ec-8385-4990-a775-f9d711d92cc3-data-volume\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.535458 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.535439 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b48ecb2f-b835-4a84-b0ca-b01696b7e237-config-volume\") pod \"dns-default-ll8sp\" (UID: 
\"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.539353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.539327 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7880a2ec-8385-4990-a775-f9d711d92cc3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.539353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.539310 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b48ecb2f-b835-4a84-b0ca-b01696b7e237-metrics-tls\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.539554 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.539363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/93d27a11-204e-4737-b1e5-95f56ec3a768-cert\") pod \"ingress-canary-mkjgb\" (UID: \"93d27a11-204e-4737-b1e5-95f56ec3a768\") " pod="openshift-ingress-canary/ingress-canary-mkjgb" Apr 20 23:13:41.542483 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.542457 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75tbf\" (UniqueName: \"kubernetes.io/projected/b48ecb2f-b835-4a84-b0ca-b01696b7e237-kube-api-access-75tbf\") pod \"dns-default-ll8sp\" (UID: \"b48ecb2f-b835-4a84-b0ca-b01696b7e237\") " pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.542639 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.542620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2q6t\" (UniqueName: \"kubernetes.io/projected/7880a2ec-8385-4990-a775-f9d711d92cc3-kube-api-access-s2q6t\") pod \"insights-runtime-extractor-c62gd\" 
(UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.542694 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.542678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7880a2ec-8385-4990-a775-f9d711d92cc3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c62gd\" (UID: \"7880a2ec-8385-4990-a775-f9d711d92cc3\") " pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:41.542930 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.542906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scbgw\" (UniqueName: \"kubernetes.io/projected/93d27a11-204e-4737-b1e5-95f56ec3a768-kube-api-access-scbgw\") pod \"ingress-canary-mkjgb\" (UID: \"93d27a11-204e-4737-b1e5-95f56ec3a768\") " pod="openshift-ingress-canary/ingress-canary-mkjgb" Apr 20 23:13:41.575885 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.575845 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mkjgb" Apr 20 23:13:41.588740 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.588705 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:41.666886 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:41.666846 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c62gd" Apr 20 23:13:42.643702 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:42.643663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:42.644437 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:42.643850 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 23:13:42.644437 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:42.643874 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 23:13:42.644437 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:42.643886 2577 projected.go:194] Error preparing data for projected volume kube-api-access-48wx7 for pod openshift-network-diagnostics/network-check-target-cvv7k: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:42.644437 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:42.643978 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7 podName:dfd7de81-99ac-43ab-b2b7-773e89f49916 nodeName:}" failed. No retries permitted until 2026-04-20 23:14:14.643937802 +0000 UTC m=+84.315306930 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-48wx7" (UniqueName: "kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7") pod "network-check-target-cvv7k" (UID: "dfd7de81-99ac-43ab-b2b7-773e89f49916") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 23:13:42.745282 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:42.745242 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:42.745443 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:13:42.745340 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs podName:25b85e22-f989-497b-a027-9ffb78b0533d nodeName:}" failed. No retries permitted until 2026-04-20 23:14:14.745320577 +0000 UTC m=+84.416689720 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs") pod "network-metrics-daemon-7vff9" (UID: "25b85e22-f989-497b-a027-9ffb78b0533d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 23:13:42.747415 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:42.745618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:42.914640 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:42.914545 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:13:42.914640 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:42.914575 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:13:42.917793 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:42.917765 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 23:13:42.917927 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:42.917874 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g6bnb\"" Apr 20 23:13:42.917927 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:42.917880 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 23:13:42.917927 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:42.917919 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 23:13:42.918119 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:42.917963 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sk2w6\"" Apr 20 23:13:43.783364 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:43.783333 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c62gd"] Apr 20 23:13:43.786382 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:43.786357 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ll8sp"] Apr 20 23:13:43.790300 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:43.790054 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb48ecb2f_b835_4a84_b0ca_b01696b7e237.slice/crio-7fac3adbdc25796fd9ef15b7c573ef20368c26a83dcadfa166ccaf967c773430 WatchSource:0}: Error finding container 7fac3adbdc25796fd9ef15b7c573ef20368c26a83dcadfa166ccaf967c773430: Status 404 returned error can't find the container with id 7fac3adbdc25796fd9ef15b7c573ef20368c26a83dcadfa166ccaf967c773430 Apr 20 23:13:43.790595 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:43.790576 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mkjgb"] Apr 20 23:13:43.973898 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:13:43.973871 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93d27a11_204e_4737_b1e5_95f56ec3a768.slice/crio-cc7810c73456ec76dbba67db55ee6ce1743dedc5ab451a463fa01dea70419844 WatchSource:0}: Error finding container cc7810c73456ec76dbba67db55ee6ce1743dedc5ab451a463fa01dea70419844: Status 404 returned error can't find the container with id cc7810c73456ec76dbba67db55ee6ce1743dedc5ab451a463fa01dea70419844 Apr 20 23:13:44.197041 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:44.196988 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" event={"ID":"042233f9-bbe1-4bd7-acbd-b2180ef39cbb","Type":"ContainerStarted","Data":"afac4948ba33a9ebac4f4a330be1fc8771b1219ba354e35eb6fff202aa9c2ef1"} Apr 20 23:13:44.198694 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:44.198663 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c62gd" event={"ID":"7880a2ec-8385-4990-a775-f9d711d92cc3","Type":"ContainerStarted","Data":"5e1e8b7348212dc4cf17f784d98d6ff65a73f5c558fa5c0fc63399059d81c81e"} Apr 20 23:13:44.198813 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:44.198706 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-c62gd" event={"ID":"7880a2ec-8385-4990-a775-f9d711d92cc3","Type":"ContainerStarted","Data":"70bdf17d38f28d753793503fa5bd2d5f5c80c23bad23495cf2371897b24f917e"} Apr 20 23:13:44.199777 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:44.199751 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ll8sp" event={"ID":"b48ecb2f-b835-4a84-b0ca-b01696b7e237","Type":"ContainerStarted","Data":"7fac3adbdc25796fd9ef15b7c573ef20368c26a83dcadfa166ccaf967c773430"} Apr 20 23:13:44.201053 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:44.201024 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mkjgb" event={"ID":"93d27a11-204e-4737-b1e5-95f56ec3a768","Type":"ContainerStarted","Data":"cc7810c73456ec76dbba67db55ee6ce1743dedc5ab451a463fa01dea70419844"} Apr 20 23:13:45.206222 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:45.206186 2577 generic.go:358] "Generic (PLEG): container finished" podID="042233f9-bbe1-4bd7-acbd-b2180ef39cbb" containerID="afac4948ba33a9ebac4f4a330be1fc8771b1219ba354e35eb6fff202aa9c2ef1" exitCode=0 Apr 20 23:13:45.206681 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:45.206268 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" event={"ID":"042233f9-bbe1-4bd7-acbd-b2180ef39cbb","Type":"ContainerDied","Data":"afac4948ba33a9ebac4f4a330be1fc8771b1219ba354e35eb6fff202aa9c2ef1"} Apr 20 23:13:47.214012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:47.213970 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ll8sp" event={"ID":"b48ecb2f-b835-4a84-b0ca-b01696b7e237","Type":"ContainerStarted","Data":"0c044a07b49f40e3e19c54591f0727b9a02091ca6d9c5453b0b8048117f50ae0"} Apr 20 23:13:47.214012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:47.214015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ll8sp" 
event={"ID":"b48ecb2f-b835-4a84-b0ca-b01696b7e237","Type":"ContainerStarted","Data":"741077db727feb4fb6d0d15b104772567c79b6beabb32bbc70af0a5fe7b61ea5"} Apr 20 23:13:47.214533 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:47.214124 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ll8sp" Apr 20 23:13:47.215382 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:47.215358 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mkjgb" event={"ID":"93d27a11-204e-4737-b1e5-95f56ec3a768","Type":"ContainerStarted","Data":"481c8c4992b72a62e4fb0990786bf46d25edf57d3e3e6614c16d8cef262af594"} Apr 20 23:13:47.218121 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:47.218092 2577 generic.go:358] "Generic (PLEG): container finished" podID="042233f9-bbe1-4bd7-acbd-b2180ef39cbb" containerID="2114a7cc7dad739fa912d7a4f2d722435e57e10e35ef6fd5e9ec83753f0cb172" exitCode=0 Apr 20 23:13:47.218262 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:47.218121 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" event={"ID":"042233f9-bbe1-4bd7-acbd-b2180ef39cbb","Type":"ContainerDied","Data":"2114a7cc7dad739fa912d7a4f2d722435e57e10e35ef6fd5e9ec83753f0cb172"} Apr 20 23:13:47.219890 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:47.219867 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c62gd" event={"ID":"7880a2ec-8385-4990-a775-f9d711d92cc3","Type":"ContainerStarted","Data":"00e9fac32d1f124f1fa9496a9df692f28a9df510c08798e9d8df90e91cfc7a0f"} Apr 20 23:13:47.234904 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:47.234587 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ll8sp" podStartSLOduration=3.786762369 podStartE2EDuration="6.234573343s" podCreationTimestamp="2026-04-20 23:13:41 +0000 UTC" 
firstStartedPulling="2026-04-20 23:13:43.972366324 +0000 UTC m=+53.643735449" lastFinishedPulling="2026-04-20 23:13:46.420177291 +0000 UTC m=+56.091546423" observedRunningTime="2026-04-20 23:13:47.234126565 +0000 UTC m=+56.905495713" watchObservedRunningTime="2026-04-20 23:13:47.234573343 +0000 UTC m=+56.905942576" Apr 20 23:13:47.254713 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:47.254648 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mkjgb" podStartSLOduration=3.819642746 podStartE2EDuration="6.254628861s" podCreationTimestamp="2026-04-20 23:13:41 +0000 UTC" firstStartedPulling="2026-04-20 23:13:43.989637754 +0000 UTC m=+53.661006879" lastFinishedPulling="2026-04-20 23:13:46.424623854 +0000 UTC m=+56.095992994" observedRunningTime="2026-04-20 23:13:47.253484918 +0000 UTC m=+56.924854066" watchObservedRunningTime="2026-04-20 23:13:47.254628861 +0000 UTC m=+56.925998009" Apr 20 23:13:48.227248 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:48.227144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" event={"ID":"042233f9-bbe1-4bd7-acbd-b2180ef39cbb","Type":"ContainerStarted","Data":"72c9bb97f3066f938624239c36f93ba8ee21da24d8d636d6ca2129a599c6bb75"} Apr 20 23:13:48.228937 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:48.228903 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c62gd" event={"ID":"7880a2ec-8385-4990-a775-f9d711d92cc3","Type":"ContainerStarted","Data":"c8557e8bd3b09781137cb6d6cac82a69d46e24232331a4f48d6e111979a5ab85"} Apr 20 23:13:48.252446 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:48.252385 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g2ng5" podStartSLOduration=5.499701644 podStartE2EDuration="38.252370225s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" 
firstStartedPulling="2026-04-20 23:13:11.260644379 +0000 UTC m=+20.932013507" lastFinishedPulling="2026-04-20 23:13:44.01331296 +0000 UTC m=+53.684682088" observedRunningTime="2026-04-20 23:13:48.250925544 +0000 UTC m=+57.922294691" watchObservedRunningTime="2026-04-20 23:13:48.252370225 +0000 UTC m=+57.923739371" Apr 20 23:13:48.272620 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:48.272571 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c62gd" podStartSLOduration=3.486403984 podStartE2EDuration="7.272556828s" podCreationTimestamp="2026-04-20 23:13:41 +0000 UTC" firstStartedPulling="2026-04-20 23:13:44.075322746 +0000 UTC m=+53.746691871" lastFinishedPulling="2026-04-20 23:13:47.861475588 +0000 UTC m=+57.532844715" observedRunningTime="2026-04-20 23:13:48.271788307 +0000 UTC m=+57.943157455" watchObservedRunningTime="2026-04-20 23:13:48.272556828 +0000 UTC m=+57.943925974" Apr 20 23:13:57.231992 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:13:57.231960 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ll8sp" Apr 20 23:14:07.193739 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:07.193617 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kbfps" Apr 20 23:14:14.679360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.679314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:14:14.681872 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.681849 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 23:14:14.692131 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.692107 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 23:14:14.703666 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.703633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48wx7\" (UniqueName: \"kubernetes.io/projected/dfd7de81-99ac-43ab-b2b7-773e89f49916-kube-api-access-48wx7\") pod \"network-check-target-cvv7k\" (UID: \"dfd7de81-99ac-43ab-b2b7-773e89f49916\") " pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:14:14.734639 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.734608 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g6bnb\"" Apr 20 23:14:14.743399 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.743362 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:14:14.780269 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.780235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:14:14.782737 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.782709 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 23:14:14.793374 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.793348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/25b85e22-f989-497b-a027-9ffb78b0533d-metrics-certs\") pod \"network-metrics-daemon-7vff9\" (UID: \"25b85e22-f989-497b-a027-9ffb78b0533d\") " pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:14:14.859696 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:14.859659 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cvv7k"] Apr 20 23:14:15.029159 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:15.029080 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-sk2w6\"" Apr 20 23:14:15.037021 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:15.036996 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7vff9" Apr 20 23:14:15.149136 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:15.149104 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7vff9"] Apr 20 23:14:15.151993 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:14:15.151961 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b85e22_f989_497b_a027_9ffb78b0533d.slice/crio-c6fe3cc3707890aa01c68ab145bb1969150066546c36b999081687d89ebcecdb WatchSource:0}: Error finding container c6fe3cc3707890aa01c68ab145bb1969150066546c36b999081687d89ebcecdb: Status 404 returned error can't find the container with id c6fe3cc3707890aa01c68ab145bb1969150066546c36b999081687d89ebcecdb Apr 20 23:14:15.297249 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:15.297150 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cvv7k" event={"ID":"dfd7de81-99ac-43ab-b2b7-773e89f49916","Type":"ContainerStarted","Data":"32c5f3bf6d233a5603966da359bb217bbc76bdf3adf7296aa5d9ce9859de1eb9"} Apr 20 23:14:15.298314 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:15.298284 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7vff9" event={"ID":"25b85e22-f989-497b-a027-9ffb78b0533d","Type":"ContainerStarted","Data":"c6fe3cc3707890aa01c68ab145bb1969150066546c36b999081687d89ebcecdb"} Apr 20 23:14:17.307600 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:17.307547 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7vff9" event={"ID":"25b85e22-f989-497b-a027-9ffb78b0533d","Type":"ContainerStarted","Data":"a9d39c82457415b619b48559f7031a9fd5420c2f2f95bed1c5f91c7303bbc671"} Apr 20 23:14:17.307600 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:17.307599 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-7vff9" event={"ID":"25b85e22-f989-497b-a027-9ffb78b0533d","Type":"ContainerStarted","Data":"25ac6b19dbae47209ffc015d7657d75e5db24318fc3af60969f5c1dd29618ab7"} Apr 20 23:14:17.324361 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:17.324292 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7vff9" podStartSLOduration=66.273392045 podStartE2EDuration="1m7.324271746s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:14:15.153805531 +0000 UTC m=+84.825174656" lastFinishedPulling="2026-04-20 23:14:16.204685229 +0000 UTC m=+85.876054357" observedRunningTime="2026-04-20 23:14:17.322861592 +0000 UTC m=+86.994230735" watchObservedRunningTime="2026-04-20 23:14:17.324271746 +0000 UTC m=+86.995640895" Apr 20 23:14:18.310743 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:18.310703 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cvv7k" event={"ID":"dfd7de81-99ac-43ab-b2b7-773e89f49916","Type":"ContainerStarted","Data":"4f883a3da99737a75f7b2b265fae44aa6c57bc26b76d521a94cd36923585d839"} Apr 20 23:14:18.311302 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:18.310840 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cvv7k" Apr 20 23:14:18.325071 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:18.325012 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cvv7k" podStartSLOduration=65.630741793 podStartE2EDuration="1m8.32499555s" podCreationTimestamp="2026-04-20 23:13:10 +0000 UTC" firstStartedPulling="2026-04-20 23:14:14.865415491 +0000 UTC m=+84.536784622" lastFinishedPulling="2026-04-20 23:14:17.559669254 +0000 UTC m=+87.231038379" observedRunningTime="2026-04-20 23:14:18.324671119 +0000 UTC m=+87.996040267" 
watchObservedRunningTime="2026-04-20 23:14:18.32499555 +0000 UTC m=+87.996364697"
Apr 20 23:14:49.316013 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:49.315974 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cvv7k"
Apr 20 23:14:54.392096 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.392059 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-68888c5bd7-mzktx"]
Apr 20 23:14:54.397240 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.397216 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.399632 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.399607 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 20 23:14:54.399777 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.399642 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 20 23:14:54.399777 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.399753 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 20 23:14:54.399894 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.399882 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 20 23:14:54.399970 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.399911 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-bc4ms\""
Apr 20 23:14:54.402722 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.402705 2577 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 20 23:14:54.411664 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.411638 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-68888c5bd7-mzktx"]
Apr 20 23:14:54.422168 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.422144 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 20 23:14:54.549138 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.549093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f615191a-9efd-4021-9ffc-8045826ad131-metrics-client-ca\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.549138 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.549137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-federate-client-tls\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.549369 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.549165 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppzqb\" (UniqueName: \"kubernetes.io/projected/f615191a-9efd-4021-9ffc-8045826ad131-kube-api-access-ppzqb\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.549369 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.549189 2577 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f615191a-9efd-4021-9ffc-8045826ad131-serving-certs-ca-bundle\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.549369 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.549254 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.549369 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.549299 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f615191a-9efd-4021-9ffc-8045826ad131-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.549369 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.549337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-secret-telemeter-client\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.549369 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.549363 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-telemeter-client-tls\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.650421 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.650340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f615191a-9efd-4021-9ffc-8045826ad131-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.650421 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.650385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-secret-telemeter-client\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.650421 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.650404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-telemeter-client-tls\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.650653 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.650444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f615191a-9efd-4021-9ffc-8045826ad131-metrics-client-ca\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") "
pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.650653 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.650473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-federate-client-tls\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.650653 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.650498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppzqb\" (UniqueName: \"kubernetes.io/projected/f615191a-9efd-4021-9ffc-8045826ad131-kube-api-access-ppzqb\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.650653 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.650525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f615191a-9efd-4021-9ffc-8045826ad131-serving-certs-ca-bundle\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.650653 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.650561 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.651373 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.651344 2577 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f615191a-9efd-4021-9ffc-8045826ad131-metrics-client-ca\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.651495 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.651407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f615191a-9efd-4021-9ffc-8045826ad131-serving-certs-ca-bundle\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.651643 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.651621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f615191a-9efd-4021-9ffc-8045826ad131-telemeter-trusted-ca-bundle\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.653230 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.653207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-telemeter-client-tls\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.653321 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.653207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-68888c5bd7-mzktx\"
(UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.653600 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.653580 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-secret-telemeter-client\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.653651 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.653580 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/f615191a-9efd-4021-9ffc-8045826ad131-federate-client-tls\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.658008 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.657988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppzqb\" (UniqueName: \"kubernetes.io/projected/f615191a-9efd-4021-9ffc-8045826ad131-kube-api-access-ppzqb\") pod \"telemeter-client-68888c5bd7-mzktx\" (UID: \"f615191a-9efd-4021-9ffc-8045826ad131\") " pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.706021 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.705989 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx"
Apr 20 23:14:54.827164 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:54.827139 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-68888c5bd7-mzktx"]
Apr 20 23:14:54.829014 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:14:54.828984 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf615191a_9efd_4021_9ffc_8045826ad131.slice/crio-468b30de5ed3ca8d18fde5076bd8c03bd97f6e5443b01f01459fc2759fc106f9 WatchSource:0}: Error finding container 468b30de5ed3ca8d18fde5076bd8c03bd97f6e5443b01f01459fc2759fc106f9: Status 404 returned error can't find the container with id 468b30de5ed3ca8d18fde5076bd8c03bd97f6e5443b01f01459fc2759fc106f9
Apr 20 23:14:55.402461 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:55.402429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx" event={"ID":"f615191a-9efd-4021-9ffc-8045826ad131","Type":"ContainerStarted","Data":"468b30de5ed3ca8d18fde5076bd8c03bd97f6e5443b01f01459fc2759fc106f9"}
Apr 20 23:14:57.408622 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:57.408586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx" event={"ID":"f615191a-9efd-4021-9ffc-8045826ad131","Type":"ContainerStarted","Data":"36c3adc45ab2a0a016c1b516ae72f55bab2cbb9bdc7b00bed82df855fbcf3b4f"}
Apr 20 23:14:58.412536 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:58.412499 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx" event={"ID":"f615191a-9efd-4021-9ffc-8045826ad131","Type":"ContainerStarted","Data":"256d7e8b9d73851a59f5c4c89de72ad7ebeed0cb9e772318d014763a8ecef1fd"}
Apr 20 23:14:58.412536 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:58.412536 2577 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx" event={"ID":"f615191a-9efd-4021-9ffc-8045826ad131","Type":"ContainerStarted","Data":"32591bdad6e606646ef539deb137d586d3196ab3e6497911a2e29519b4cee176"}
Apr 20 23:14:58.433723 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:14:58.433674 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-68888c5bd7-mzktx" podStartSLOduration=1.435154824 podStartE2EDuration="4.433661056s" podCreationTimestamp="2026-04-20 23:14:54 +0000 UTC" firstStartedPulling="2026-04-20 23:14:54.831527713 +0000 UTC m=+124.502896853" lastFinishedPulling="2026-04-20 23:14:57.830033955 +0000 UTC m=+127.501403085" observedRunningTime="2026-04-20 23:14:58.431856589 +0000 UTC m=+128.103225735" watchObservedRunningTime="2026-04-20 23:14:58.433661056 +0000 UTC m=+128.105030202"
Apr 20 23:16:42.462847 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.462806 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9znkk"]
Apr 20 23:16:42.465906 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.465880 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.468363 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.468339 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 23:16:42.473118 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.473097 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9znkk"]
Apr 20 23:16:42.575543 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.575503 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6bc92d9d-0606-47a0-bac5-0b39b85308a8-kubelet-config\") pod \"global-pull-secret-syncer-9znkk\" (UID: \"6bc92d9d-0606-47a0-bac5-0b39b85308a8\") " pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.575543 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.575551 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6bc92d9d-0606-47a0-bac5-0b39b85308a8-dbus\") pod \"global-pull-secret-syncer-9znkk\" (UID: \"6bc92d9d-0606-47a0-bac5-0b39b85308a8\") " pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.575774 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.575582 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6bc92d9d-0606-47a0-bac5-0b39b85308a8-original-pull-secret\") pod \"global-pull-secret-syncer-9znkk\" (UID: \"6bc92d9d-0606-47a0-bac5-0b39b85308a8\") " pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.676328 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.676296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName:
\"kubernetes.io/host-path/6bc92d9d-0606-47a0-bac5-0b39b85308a8-kubelet-config\") pod \"global-pull-secret-syncer-9znkk\" (UID: \"6bc92d9d-0606-47a0-bac5-0b39b85308a8\") " pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.676328 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.676335 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6bc92d9d-0606-47a0-bac5-0b39b85308a8-dbus\") pod \"global-pull-secret-syncer-9znkk\" (UID: \"6bc92d9d-0606-47a0-bac5-0b39b85308a8\") " pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.676546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.676355 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6bc92d9d-0606-47a0-bac5-0b39b85308a8-original-pull-secret\") pod \"global-pull-secret-syncer-9znkk\" (UID: \"6bc92d9d-0606-47a0-bac5-0b39b85308a8\") " pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.676546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.676433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6bc92d9d-0606-47a0-bac5-0b39b85308a8-kubelet-config\") pod \"global-pull-secret-syncer-9znkk\" (UID: \"6bc92d9d-0606-47a0-bac5-0b39b85308a8\") " pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.676546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.676494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6bc92d9d-0606-47a0-bac5-0b39b85308a8-dbus\") pod \"global-pull-secret-syncer-9znkk\" (UID: \"6bc92d9d-0606-47a0-bac5-0b39b85308a8\") " pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.678657 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.678630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6bc92d9d-0606-47a0-bac5-0b39b85308a8-original-pull-secret\") pod \"global-pull-secret-syncer-9znkk\" (UID: \"6bc92d9d-0606-47a0-bac5-0b39b85308a8\") " pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.775886 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.775785 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9znkk"
Apr 20 23:16:42.896802 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:42.896720 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9znkk"]
Apr 20 23:16:42.899275 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:16:42.899243 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bc92d9d_0606_47a0_bac5_0b39b85308a8.slice/crio-e6a600779a0f3f6a21d668bf0c516575eb8d3cb47cc7a0d8c2be26c4e899f2db WatchSource:0}: Error finding container e6a600779a0f3f6a21d668bf0c516575eb8d3cb47cc7a0d8c2be26c4e899f2db: Status 404 returned error can't find the container with id e6a600779a0f3f6a21d668bf0c516575eb8d3cb47cc7a0d8c2be26c4e899f2db
Apr 20 23:16:43.667993 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:43.667929 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9znkk" event={"ID":"6bc92d9d-0606-47a0-bac5-0b39b85308a8","Type":"ContainerStarted","Data":"e6a600779a0f3f6a21d668bf0c516575eb8d3cb47cc7a0d8c2be26c4e899f2db"}
Apr 20 23:16:46.676870 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:46.676831 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9znkk" event={"ID":"6bc92d9d-0606-47a0-bac5-0b39b85308a8","Type":"ContainerStarted","Data":"c122be7b9ff6d243de626d31f7d09f54a6e1ae50341c22de8ee487f700168754"}
Apr 20 23:16:46.693397 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:46.693325 2577
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9znkk" podStartSLOduration=1.059776425 podStartE2EDuration="4.693307126s" podCreationTimestamp="2026-04-20 23:16:42 +0000 UTC" firstStartedPulling="2026-04-20 23:16:42.901006421 +0000 UTC m=+232.572375550" lastFinishedPulling="2026-04-20 23:16:46.534537124 +0000 UTC m=+236.205906251" observedRunningTime="2026-04-20 23:16:46.692684465 +0000 UTC m=+236.364053613" watchObservedRunningTime="2026-04-20 23:16:46.693307126 +0000 UTC m=+236.364676276"
Apr 20 23:16:55.277238 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.277202 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"]
Apr 20 23:16:55.280873 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.280856 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.284895 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.284872 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zhcjn\""
Apr 20 23:16:55.285042 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.284878 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 23:16:55.285042 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.284965 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 23:16:55.292012 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.291981 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"]
Apr 20 23:16:55.366792 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.366753 2577
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.367006 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.366825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.367006 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.366850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d85wt\" (UniqueName: \"kubernetes.io/projected/80b18208-c44e-480e-8252-c7864f931646-kube-api-access-d85wt\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.467413 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.467373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.467413 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.467417 2577
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d85wt\" (UniqueName: \"kubernetes.io/projected/80b18208-c44e-480e-8252-c7864f931646-kube-api-access-d85wt\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.467664 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.467463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.467906 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.467883 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.468019 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.467973 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.476460 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.476433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d85wt\"
(UniqueName: \"kubernetes.io/projected/80b18208-c44e-480e-8252-c7864f931646-kube-api-access-d85wt\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.590161 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.590123 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"
Apr 20 23:16:55.712128 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:55.712103 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c"]
Apr 20 23:16:55.714686 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:16:55.714652 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80b18208_c44e_480e_8252_c7864f931646.slice/crio-8d688dd806efc5e381ec29bb38076998965b7c36eae093c53ccae7094e73ed4e WatchSource:0}: Error finding container 8d688dd806efc5e381ec29bb38076998965b7c36eae093c53ccae7094e73ed4e: Status 404 returned error can't find the container with id 8d688dd806efc5e381ec29bb38076998965b7c36eae093c53ccae7094e73ed4e
Apr 20 23:16:56.704408 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:16:56.704367 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c" event={"ID":"80b18208-c44e-480e-8252-c7864f931646","Type":"ContainerStarted","Data":"8d688dd806efc5e381ec29bb38076998965b7c36eae093c53ccae7094e73ed4e"}
Apr 20 23:17:02.721556 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:02.721507 2577 generic.go:358] "Generic (PLEG): container finished" podID="80b18208-c44e-480e-8252-c7864f931646" containerID="7e40e3efcaeecc3f09ed6582127579cad72963ef898492d4570b9e8da5de2b44"
exitCode=0
Apr 20 23:17:02.721991 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:02.721594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c" event={"ID":"80b18208-c44e-480e-8252-c7864f931646","Type":"ContainerDied","Data":"7e40e3efcaeecc3f09ed6582127579cad72963ef898492d4570b9e8da5de2b44"}
Apr 20 23:17:05.731789 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:05.731759 2577 generic.go:358] "Generic (PLEG): container finished" podID="80b18208-c44e-480e-8252-c7864f931646" containerID="6567111f6f33885c4e8e4ea392bef55175ba6b55a8ba1f2540e58ca6d20d0cef" exitCode=0
Apr 20 23:17:05.732211 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:05.731845 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c" event={"ID":"80b18208-c44e-480e-8252-c7864f931646","Type":"ContainerDied","Data":"6567111f6f33885c4e8e4ea392bef55175ba6b55a8ba1f2540e58ca6d20d0cef"}
Apr 20 23:17:13.755822 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:13.755788 2577 generic.go:358] "Generic (PLEG): container finished" podID="80b18208-c44e-480e-8252-c7864f931646" containerID="5f063f2056b1534686bb9d1615860adb92dd3753f52bafd3c9dbec39726fd3b5" exitCode=0
Apr 20 23:17:13.756263 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:13.755874 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c" event={"ID":"80b18208-c44e-480e-8252-c7864f931646","Type":"ContainerDied","Data":"5f063f2056b1534686bb9d1615860adb92dd3753f52bafd3c9dbec39726fd3b5"}
Apr 20 23:17:14.883644 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:14.883618 2577 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c" Apr 20 23:17:15.017601 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.017503 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-util\") pod \"80b18208-c44e-480e-8252-c7864f931646\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " Apr 20 23:17:15.017601 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.017543 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-bundle\") pod \"80b18208-c44e-480e-8252-c7864f931646\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " Apr 20 23:17:15.017601 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.017576 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d85wt\" (UniqueName: \"kubernetes.io/projected/80b18208-c44e-480e-8252-c7864f931646-kube-api-access-d85wt\") pod \"80b18208-c44e-480e-8252-c7864f931646\" (UID: \"80b18208-c44e-480e-8252-c7864f931646\") " Apr 20 23:17:15.018218 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.018189 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-bundle" (OuterVolumeSpecName: "bundle") pod "80b18208-c44e-480e-8252-c7864f931646" (UID: "80b18208-c44e-480e-8252-c7864f931646"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:17:15.019912 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.019887 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b18208-c44e-480e-8252-c7864f931646-kube-api-access-d85wt" (OuterVolumeSpecName: "kube-api-access-d85wt") pod "80b18208-c44e-480e-8252-c7864f931646" (UID: "80b18208-c44e-480e-8252-c7864f931646"). InnerVolumeSpecName "kube-api-access-d85wt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:17:15.022214 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.022185 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-util" (OuterVolumeSpecName: "util") pod "80b18208-c44e-480e-8252-c7864f931646" (UID: "80b18208-c44e-480e-8252-c7864f931646"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:17:15.118449 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.118395 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d85wt\" (UniqueName: \"kubernetes.io/projected/80b18208-c44e-480e-8252-c7864f931646-kube-api-access-d85wt\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:17:15.118449 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.118444 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:17:15.118449 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.118458 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80b18208-c44e-480e-8252-c7864f931646-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:17:15.762351 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.762313 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c" event={"ID":"80b18208-c44e-480e-8252-c7864f931646","Type":"ContainerDied","Data":"8d688dd806efc5e381ec29bb38076998965b7c36eae093c53ccae7094e73ed4e"} Apr 20 23:17:15.762351 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.762349 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d688dd806efc5e381ec29bb38076998965b7c36eae093c53ccae7094e73ed4e" Apr 20 23:17:15.762351 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:15.762330 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19ddsv7c" Apr 20 23:17:23.344219 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.344185 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz"] Apr 20 23:17:23.344806 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.344514 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80b18208-c44e-480e-8252-c7864f931646" containerName="extract" Apr 20 23:17:23.344806 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.344533 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b18208-c44e-480e-8252-c7864f931646" containerName="extract" Apr 20 23:17:23.344806 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.344550 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80b18208-c44e-480e-8252-c7864f931646" containerName="util" Apr 20 23:17:23.344806 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.344560 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b18208-c44e-480e-8252-c7864f931646" containerName="util" Apr 20 23:17:23.344806 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.344570 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="80b18208-c44e-480e-8252-c7864f931646" containerName="pull" Apr 20 23:17:23.344806 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.344578 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b18208-c44e-480e-8252-c7864f931646" containerName="pull" Apr 20 23:17:23.344806 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.344629 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="80b18208-c44e-480e-8252-c7864f931646" containerName="extract" Apr 20 23:17:23.348710 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.348687 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" Apr 20 23:17:23.353157 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.353133 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 20 23:17:23.353430 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.353414 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-v2nq9\"" Apr 20 23:17:23.353699 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.353677 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 20 23:17:23.365212 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.365183 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz"] Apr 20 23:17:23.477130 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.477087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfpqz\" (UniqueName: \"kubernetes.io/projected/e6f274ec-6abc-41f6-9915-060100dfdb54-kube-api-access-lfpqz\") pod 
\"cert-manager-operator-controller-manager-54b9655956-n5rtz\" (UID: \"e6f274ec-6abc-41f6-9915-060100dfdb54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" Apr 20 23:17:23.477325 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.477140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6f274ec-6abc-41f6-9915-060100dfdb54-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-n5rtz\" (UID: \"e6f274ec-6abc-41f6-9915-060100dfdb54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" Apr 20 23:17:23.578389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.578347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfpqz\" (UniqueName: \"kubernetes.io/projected/e6f274ec-6abc-41f6-9915-060100dfdb54-kube-api-access-lfpqz\") pod \"cert-manager-operator-controller-manager-54b9655956-n5rtz\" (UID: \"e6f274ec-6abc-41f6-9915-060100dfdb54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" Apr 20 23:17:23.578560 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.578407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6f274ec-6abc-41f6-9915-060100dfdb54-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-n5rtz\" (UID: \"e6f274ec-6abc-41f6-9915-060100dfdb54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" Apr 20 23:17:23.578793 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.578772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e6f274ec-6abc-41f6-9915-060100dfdb54-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-n5rtz\" (UID: \"e6f274ec-6abc-41f6-9915-060100dfdb54\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" Apr 20 23:17:23.592959 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.592908 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfpqz\" (UniqueName: \"kubernetes.io/projected/e6f274ec-6abc-41f6-9915-060100dfdb54-kube-api-access-lfpqz\") pod \"cert-manager-operator-controller-manager-54b9655956-n5rtz\" (UID: \"e6f274ec-6abc-41f6-9915-060100dfdb54\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" Apr 20 23:17:23.657719 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.657623 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" Apr 20 23:17:23.786235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:23.786205 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz"] Apr 20 23:17:23.790749 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:17:23.790700 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6f274ec_6abc_41f6_9915_060100dfdb54.slice/crio-272acb41a17e3cf2227e1639350b00fc414e29a7868be22532c8cfb9bf6b7c76 WatchSource:0}: Error finding container 272acb41a17e3cf2227e1639350b00fc414e29a7868be22532c8cfb9bf6b7c76: Status 404 returned error can't find the container with id 272acb41a17e3cf2227e1639350b00fc414e29a7868be22532c8cfb9bf6b7c76 Apr 20 23:17:24.787291 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:24.787247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" event={"ID":"e6f274ec-6abc-41f6-9915-060100dfdb54","Type":"ContainerStarted","Data":"272acb41a17e3cf2227e1639350b00fc414e29a7868be22532c8cfb9bf6b7c76"} Apr 20 23:17:25.791799 
ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:25.791697 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" event={"ID":"e6f274ec-6abc-41f6-9915-060100dfdb54","Type":"ContainerStarted","Data":"2e9a7126ca8c6070168fcbca6f5b1a3fab97eb9b571181661bc505d379335685"} Apr 20 23:17:25.831092 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:25.831034 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-n5rtz" podStartSLOduration=1.25931177 podStartE2EDuration="2.831016921s" podCreationTimestamp="2026-04-20 23:17:23 +0000 UTC" firstStartedPulling="2026-04-20 23:17:23.794481261 +0000 UTC m=+273.465850390" lastFinishedPulling="2026-04-20 23:17:25.366186402 +0000 UTC m=+275.037555541" observedRunningTime="2026-04-20 23:17:25.829873539 +0000 UTC m=+275.501242685" watchObservedRunningTime="2026-04-20 23:17:25.831016921 +0000 UTC m=+275.502386067" Apr 20 23:17:27.617097 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.617060 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx"] Apr 20 23:17:27.620238 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.620222 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.622732 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.622710 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 23:17:27.623849 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.623832 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zhcjn\"" Apr 20 23:17:27.623897 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.623838 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 23:17:27.632031 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.632007 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx"] Apr 20 23:17:27.807954 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.807923 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.808139 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.807976 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.808139 
ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.808073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snkrx\" (UniqueName: \"kubernetes.io/projected/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-kube-api-access-snkrx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.908897 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.908814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.908897 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.908852 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.909092 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.908901 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snkrx\" (UniqueName: \"kubernetes.io/projected/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-kube-api-access-snkrx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.909218 ip-10-0-134-166 
kubenswrapper[2577]: I0420 23:17:27.909200 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.909264 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.909227 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.917200 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.917177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snkrx\" (UniqueName: \"kubernetes.io/projected/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-kube-api-access-snkrx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:27.929218 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:27.929189 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" Apr 20 23:17:28.046490 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.046460 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx"] Apr 20 23:17:28.048789 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:17:28.048762 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1712ae83_ff68_4bfb_864b_617e4a8ef8c2.slice/crio-24eea286594eee6812a30c975cecdc4e8a11261ea9d311ed2dfdad4317d1247d WatchSource:0}: Error finding container 24eea286594eee6812a30c975cecdc4e8a11261ea9d311ed2dfdad4317d1247d: Status 404 returned error can't find the container with id 24eea286594eee6812a30c975cecdc4e8a11261ea9d311ed2dfdad4317d1247d Apr 20 23:17:28.231259 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.231181 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-nbshl"] Apr 20 23:17:28.234158 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.234141 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" Apr 20 23:17:28.236722 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.236696 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 23:17:28.236853 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.236696 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-m65x5\"" Apr 20 23:17:28.236853 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.236739 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 23:17:28.242149 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.242123 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-nbshl"] Apr 20 23:17:28.412726 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.412686 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a085bf8-f854-4e24-9bf4-095b0b462f1f-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-nbshl\" (UID: \"6a085bf8-f854-4e24-9bf4-095b0b462f1f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" Apr 20 23:17:28.412892 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.412762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb2qm\" (UniqueName: \"kubernetes.io/projected/6a085bf8-f854-4e24-9bf4-095b0b462f1f-kube-api-access-nb2qm\") pod \"cert-manager-webhook-587ccfb98-nbshl\" (UID: \"6a085bf8-f854-4e24-9bf4-095b0b462f1f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" Apr 20 23:17:28.513393 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.513302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/6a085bf8-f854-4e24-9bf4-095b0b462f1f-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-nbshl\" (UID: \"6a085bf8-f854-4e24-9bf4-095b0b462f1f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" Apr 20 23:17:28.513393 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.513368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nb2qm\" (UniqueName: \"kubernetes.io/projected/6a085bf8-f854-4e24-9bf4-095b0b462f1f-kube-api-access-nb2qm\") pod \"cert-manager-webhook-587ccfb98-nbshl\" (UID: \"6a085bf8-f854-4e24-9bf4-095b0b462f1f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" Apr 20 23:17:28.524544 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.524513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6a085bf8-f854-4e24-9bf4-095b0b462f1f-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-nbshl\" (UID: \"6a085bf8-f854-4e24-9bf4-095b0b462f1f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" Apr 20 23:17:28.524659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.524569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb2qm\" (UniqueName: \"kubernetes.io/projected/6a085bf8-f854-4e24-9bf4-095b0b462f1f-kube-api-access-nb2qm\") pod \"cert-manager-webhook-587ccfb98-nbshl\" (UID: \"6a085bf8-f854-4e24-9bf4-095b0b462f1f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" Apr 20 23:17:28.550676 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.550644 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" Apr 20 23:17:28.671160 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.671136 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-nbshl"] Apr 20 23:17:28.673302 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:17:28.673277 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a085bf8_f854_4e24_9bf4_095b0b462f1f.slice/crio-d61ac644de03f97a8769763bbf1c04dd730a24386a13b13fbfdd90cc6ef0d099 WatchSource:0}: Error finding container d61ac644de03f97a8769763bbf1c04dd730a24386a13b13fbfdd90cc6ef0d099: Status 404 returned error can't find the container with id d61ac644de03f97a8769763bbf1c04dd730a24386a13b13fbfdd90cc6ef0d099 Apr 20 23:17:28.801019 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.800911 2577 generic.go:358] "Generic (PLEG): container finished" podID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerID="26c16244259056dca32fc076fb7b4919dbfa20e4441bad770a042467a4be23f9" exitCode=0 Apr 20 23:17:28.801019 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.801004 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" event={"ID":"1712ae83-ff68-4bfb-864b-617e4a8ef8c2","Type":"ContainerDied","Data":"26c16244259056dca32fc076fb7b4919dbfa20e4441bad770a042467a4be23f9"} Apr 20 23:17:28.801219 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.801042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" event={"ID":"1712ae83-ff68-4bfb-864b-617e4a8ef8c2","Type":"ContainerStarted","Data":"24eea286594eee6812a30c975cecdc4e8a11261ea9d311ed2dfdad4317d1247d"} Apr 20 23:17:28.802084 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:28.802063 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" event={"ID":"6a085bf8-f854-4e24-9bf4-095b0b462f1f","Type":"ContainerStarted","Data":"d61ac644de03f97a8769763bbf1c04dd730a24386a13b13fbfdd90cc6ef0d099"} Apr 20 23:17:29.736119 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:29.735985 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-9jz6p"] Apr 20 23:17:29.739530 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:29.739327 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" Apr 20 23:17:29.742519 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:29.742490 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-w5hg7\"" Apr 20 23:17:29.752147 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:29.752101 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-9jz6p"] Apr 20 23:17:29.924487 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:29.924446 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbs2w\" (UniqueName: \"kubernetes.io/projected/d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6-kube-api-access-tbs2w\") pod \"cert-manager-cainjector-68b757865b-9jz6p\" (UID: \"d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" Apr 20 23:17:29.924683 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:29.924498 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-9jz6p\" (UID: \"d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" Apr 20 23:17:30.025787 
ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:30.025239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbs2w\" (UniqueName: \"kubernetes.io/projected/d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6-kube-api-access-tbs2w\") pod \"cert-manager-cainjector-68b757865b-9jz6p\" (UID: \"d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" Apr 20 23:17:30.025787 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:30.025310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-9jz6p\" (UID: \"d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" Apr 20 23:17:30.035304 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:30.034594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-9jz6p\" (UID: \"d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" Apr 20 23:17:30.035304 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:30.035145 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbs2w\" (UniqueName: \"kubernetes.io/projected/d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6-kube-api-access-tbs2w\") pod \"cert-manager-cainjector-68b757865b-9jz6p\" (UID: \"d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" Apr 20 23:17:30.053784 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:30.053749 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p"
Apr 20 23:17:30.202755 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:30.202712 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-9jz6p"]
Apr 20 23:17:30.206919 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:17:30.206886 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3b1ca82_10ba_4e9e_8f7f_c9eae4c8ebc6.slice/crio-8ca6cfbe4c9fbbdbb6ccc21d9b565d979ad20f231f85b59a2add0cc1e608c4ac WatchSource:0}: Error finding container 8ca6cfbe4c9fbbdbb6ccc21d9b565d979ad20f231f85b59a2add0cc1e608c4ac: Status 404 returned error can't find the container with id 8ca6cfbe4c9fbbdbb6ccc21d9b565d979ad20f231f85b59a2add0cc1e608c4ac
Apr 20 23:17:30.809744 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:30.809699 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" event={"ID":"d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6","Type":"ContainerStarted","Data":"8ca6cfbe4c9fbbdbb6ccc21d9b565d979ad20f231f85b59a2add0cc1e608c4ac"}
Apr 20 23:17:32.819073 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:32.818926 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" event={"ID":"1712ae83-ff68-4bfb-864b-617e4a8ef8c2","Type":"ContainerStarted","Data":"b209ffeecfe431e8682cb0edf614928a1513fa7eff726a4c73c47556d4305a7f"}
Apr 20 23:17:32.820333 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:32.820305 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" event={"ID":"6a085bf8-f854-4e24-9bf4-095b0b462f1f","Type":"ContainerStarted","Data":"af784737b9005d450698df8709d67a4fd2724c8d302c1626c1319828c645aae7"}
Apr 20 23:17:32.820828 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:32.820807 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl"
Apr 20 23:17:32.822156 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:32.822130 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" event={"ID":"d3b1ca82-10ba-4e9e-8f7f-c9eae4c8ebc6","Type":"ContainerStarted","Data":"a307e2b0c0150f1b030c5cb52b8b7cc5a2a56d1d738350021ef722e0b2dbc15f"}
Apr 20 23:17:32.838748 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:32.838689 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl" podStartSLOduration=0.888601289 podStartE2EDuration="4.838671657s" podCreationTimestamp="2026-04-20 23:17:28 +0000 UTC" firstStartedPulling="2026-04-20 23:17:28.675151276 +0000 UTC m=+278.346520401" lastFinishedPulling="2026-04-20 23:17:32.625221645 +0000 UTC m=+282.296590769" observedRunningTime="2026-04-20 23:17:32.837891919 +0000 UTC m=+282.509261077" watchObservedRunningTime="2026-04-20 23:17:32.838671657 +0000 UTC m=+282.510040817"
Apr 20 23:17:32.854601 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:32.854550 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-9jz6p" podStartSLOduration=1.438088414 podStartE2EDuration="3.85453578s" podCreationTimestamp="2026-04-20 23:17:29 +0000 UTC" firstStartedPulling="2026-04-20 23:17:30.20980545 +0000 UTC m=+279.881174578" lastFinishedPulling="2026-04-20 23:17:32.626252815 +0000 UTC m=+282.297621944" observedRunningTime="2026-04-20 23:17:32.853400663 +0000 UTC m=+282.524769822" watchObservedRunningTime="2026-04-20 23:17:32.85453578 +0000 UTC m=+282.525904927"
Apr 20 23:17:33.826186 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:33.826154 2577 generic.go:358] "Generic (PLEG): container finished" podID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerID="b209ffeecfe431e8682cb0edf614928a1513fa7eff726a4c73c47556d4305a7f" exitCode=0
Apr 20 23:17:33.826615 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:33.826243 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" event={"ID":"1712ae83-ff68-4bfb-864b-617e4a8ef8c2","Type":"ContainerDied","Data":"b209ffeecfe431e8682cb0edf614928a1513fa7eff726a4c73c47556d4305a7f"}
Apr 20 23:17:34.831721 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:34.831681 2577 generic.go:358] "Generic (PLEG): container finished" podID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerID="3bb816c6c320773f7298fc53010e9e5cecaf8af937cac73d8705da27c2ea3fda" exitCode=0
Apr 20 23:17:34.832243 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:34.831764 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" event={"ID":"1712ae83-ff68-4bfb-864b-617e4a8ef8c2","Type":"ContainerDied","Data":"3bb816c6c320773f7298fc53010e9e5cecaf8af937cac73d8705da27c2ea3fda"}
Apr 20 23:17:35.953380 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:35.953358 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx"
Apr 20 23:17:36.068990 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.068923 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-util\") pod \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") "
Apr 20 23:17:36.069160 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.069027 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-bundle\") pod \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") "
Apr 20 23:17:36.069160 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.069049 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snkrx\" (UniqueName: \"kubernetes.io/projected/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-kube-api-access-snkrx\") pod \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\" (UID: \"1712ae83-ff68-4bfb-864b-617e4a8ef8c2\") "
Apr 20 23:17:36.069498 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.069466 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-bundle" (OuterVolumeSpecName: "bundle") pod "1712ae83-ff68-4bfb-864b-617e4a8ef8c2" (UID: "1712ae83-ff68-4bfb-864b-617e4a8ef8c2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:17:36.071299 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.071266 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-kube-api-access-snkrx" (OuterVolumeSpecName: "kube-api-access-snkrx") pod "1712ae83-ff68-4bfb-864b-617e4a8ef8c2" (UID: "1712ae83-ff68-4bfb-864b-617e4a8ef8c2"). InnerVolumeSpecName "kube-api-access-snkrx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:17:36.073801 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.073764 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-util" (OuterVolumeSpecName: "util") pod "1712ae83-ff68-4bfb-864b-617e4a8ef8c2" (UID: "1712ae83-ff68-4bfb-864b-617e4a8ef8c2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:17:36.169976 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.169858 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:17:36.169976 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.169894 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-snkrx\" (UniqueName: \"kubernetes.io/projected/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-kube-api-access-snkrx\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:17:36.169976 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.169903 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1712ae83-ff68-4bfb-864b-617e4a8ef8c2-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:17:36.838877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.838836 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx" event={"ID":"1712ae83-ff68-4bfb-864b-617e4a8ef8c2","Type":"ContainerDied","Data":"24eea286594eee6812a30c975cecdc4e8a11261ea9d311ed2dfdad4317d1247d"}
Apr 20 23:17:36.838877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.838878 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24eea286594eee6812a30c975cecdc4e8a11261ea9d311ed2dfdad4317d1247d"
Apr 20 23:17:36.839094 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:36.838880 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fjvkmx"
Apr 20 23:17:39.834578 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:39.834542 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-nbshl"
Apr 20 23:17:44.220157 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.220121 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"]
Apr 20 23:17:44.220546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.220344 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerName="extract"
Apr 20 23:17:44.220546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.220356 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerName="extract"
Apr 20 23:17:44.220546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.220373 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerName="pull"
Apr 20 23:17:44.220546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.220379 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerName="pull"
Apr 20 23:17:44.220546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.220389 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerName="util"
Apr 20 23:17:44.220546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.220394 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerName="util"
Apr 20 23:17:44.220546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.220451 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1712ae83-ff68-4bfb-864b-617e4a8ef8c2" containerName="extract"
Apr 20 23:17:44.226674 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.226649 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"
Apr 20 23:17:44.230807 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.230777 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 23:17:44.231498 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.231479 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 23:17:44.231598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.231495 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-b9nbv\""
Apr 20 23:17:44.234895 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.234870 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"]
Apr 20 23:17:44.320619 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.320581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rxv2\" (UniqueName: \"kubernetes.io/projected/63c92598-680b-47e6-987b-58661192e3f0-kube-api-access-5rxv2\") pod \"openshift-lws-operator-bfc7f696d-srfxd\" (UID: \"63c92598-680b-47e6-987b-58661192e3f0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"
Apr 20 23:17:44.320807 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.320647 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63c92598-680b-47e6-987b-58661192e3f0-tmp\") pod \"openshift-lws-operator-bfc7f696d-srfxd\" (UID: \"63c92598-680b-47e6-987b-58661192e3f0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"
Apr 20 23:17:44.420939 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.420902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rxv2\" (UniqueName: \"kubernetes.io/projected/63c92598-680b-47e6-987b-58661192e3f0-kube-api-access-5rxv2\") pod \"openshift-lws-operator-bfc7f696d-srfxd\" (UID: \"63c92598-680b-47e6-987b-58661192e3f0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"
Apr 20 23:17:44.421131 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.420987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63c92598-680b-47e6-987b-58661192e3f0-tmp\") pod \"openshift-lws-operator-bfc7f696d-srfxd\" (UID: \"63c92598-680b-47e6-987b-58661192e3f0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"
Apr 20 23:17:44.421313 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.421297 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/63c92598-680b-47e6-987b-58661192e3f0-tmp\") pod \"openshift-lws-operator-bfc7f696d-srfxd\" (UID: \"63c92598-680b-47e6-987b-58661192e3f0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"
Apr 20 23:17:44.429724 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.429692 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rxv2\" (UniqueName: \"kubernetes.io/projected/63c92598-680b-47e6-987b-58661192e3f0-kube-api-access-5rxv2\") pod \"openshift-lws-operator-bfc7f696d-srfxd\" (UID: \"63c92598-680b-47e6-987b-58661192e3f0\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"
Apr 20 23:17:44.536384 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.536290 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"
Apr 20 23:17:44.658989 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.658856 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd"]
Apr 20 23:17:44.661584 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:17:44.661556 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63c92598_680b_47e6_987b_58661192e3f0.slice/crio-50f4a75483ca6f5e022a635d3242a95edb5187a6bec968ef1dc2ed2216a73438 WatchSource:0}: Error finding container 50f4a75483ca6f5e022a635d3242a95edb5187a6bec968ef1dc2ed2216a73438: Status 404 returned error can't find the container with id 50f4a75483ca6f5e022a635d3242a95edb5187a6bec968ef1dc2ed2216a73438
Apr 20 23:17:44.862610 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:44.862565 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd" event={"ID":"63c92598-680b-47e6-987b-58661192e3f0","Type":"ContainerStarted","Data":"50f4a75483ca6f5e022a635d3242a95edb5187a6bec968ef1dc2ed2216a73438"}
Apr 20 23:17:46.917969 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:46.917927 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-cc58d"]
Apr 20 23:17:46.952992 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:46.952966 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-cc58d"]
Apr 20 23:17:46.953101 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:46.953052 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-cc58d"
Apr 20 23:17:46.955565 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:46.955548 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-hqbk9\""
Apr 20 23:17:47.036468 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.036430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65776247-73da-4702-a293-e7f64d8f42bb-bound-sa-token\") pod \"cert-manager-79c8d999ff-cc58d\" (UID: \"65776247-73da-4702-a293-e7f64d8f42bb\") " pod="cert-manager/cert-manager-79c8d999ff-cc58d"
Apr 20 23:17:47.036625 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.036500 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nr6c\" (UniqueName: \"kubernetes.io/projected/65776247-73da-4702-a293-e7f64d8f42bb-kube-api-access-2nr6c\") pod \"cert-manager-79c8d999ff-cc58d\" (UID: \"65776247-73da-4702-a293-e7f64d8f42bb\") " pod="cert-manager/cert-manager-79c8d999ff-cc58d"
Apr 20 23:17:47.137091 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.137009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65776247-73da-4702-a293-e7f64d8f42bb-bound-sa-token\") pod \"cert-manager-79c8d999ff-cc58d\" (UID: \"65776247-73da-4702-a293-e7f64d8f42bb\") " pod="cert-manager/cert-manager-79c8d999ff-cc58d"
Apr 20 23:17:47.137091 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.137065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nr6c\" (UniqueName: \"kubernetes.io/projected/65776247-73da-4702-a293-e7f64d8f42bb-kube-api-access-2nr6c\") pod \"cert-manager-79c8d999ff-cc58d\" (UID: \"65776247-73da-4702-a293-e7f64d8f42bb\") " pod="cert-manager/cert-manager-79c8d999ff-cc58d"
Apr 20 23:17:47.149700 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.149662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65776247-73da-4702-a293-e7f64d8f42bb-bound-sa-token\") pod \"cert-manager-79c8d999ff-cc58d\" (UID: \"65776247-73da-4702-a293-e7f64d8f42bb\") " pod="cert-manager/cert-manager-79c8d999ff-cc58d"
Apr 20 23:17:47.149798 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.149710 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nr6c\" (UniqueName: \"kubernetes.io/projected/65776247-73da-4702-a293-e7f64d8f42bb-kube-api-access-2nr6c\") pod \"cert-manager-79c8d999ff-cc58d\" (UID: \"65776247-73da-4702-a293-e7f64d8f42bb\") " pod="cert-manager/cert-manager-79c8d999ff-cc58d"
Apr 20 23:17:47.261999 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.261961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-cc58d"
Apr 20 23:17:47.403566 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.403541 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-cc58d"]
Apr 20 23:17:47.406415 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:17:47.406374 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65776247_73da_4702_a293_e7f64d8f42bb.slice/crio-0d7b4e1e4ca286da54906fd2496ee80f1380e9b934f8140e246e8ee3558cbce2 WatchSource:0}: Error finding container 0d7b4e1e4ca286da54906fd2496ee80f1380e9b934f8140e246e8ee3558cbce2: Status 404 returned error can't find the container with id 0d7b4e1e4ca286da54906fd2496ee80f1380e9b934f8140e246e8ee3558cbce2
Apr 20 23:17:47.871886 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.871849 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-cc58d" event={"ID":"65776247-73da-4702-a293-e7f64d8f42bb","Type":"ContainerStarted","Data":"2a0f339cac9e64e1b9db103d036de511a797721aef6a859d0258ccfcaaa73828"}
Apr 20 23:17:47.871886 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.871889 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-cc58d" event={"ID":"65776247-73da-4702-a293-e7f64d8f42bb","Type":"ContainerStarted","Data":"0d7b4e1e4ca286da54906fd2496ee80f1380e9b934f8140e246e8ee3558cbce2"}
Apr 20 23:17:47.873218 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.873194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd" event={"ID":"63c92598-680b-47e6-987b-58661192e3f0","Type":"ContainerStarted","Data":"f4482cd42d8fe8624e83febb0396033b3234704115016eb3935419f80c413302"}
Apr 20 23:17:47.890530 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.890492 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-cc58d" podStartSLOduration=1.8904786310000001 podStartE2EDuration="1.890478631s" podCreationTimestamp="2026-04-20 23:17:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:17:47.889052588 +0000 UTC m=+297.560421761" watchObservedRunningTime="2026-04-20 23:17:47.890478631 +0000 UTC m=+297.561847779"
Apr 20 23:17:47.907113 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:47.907060 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-srfxd" podStartSLOduration=1.709478743 podStartE2EDuration="3.907042579s" podCreationTimestamp="2026-04-20 23:17:44 +0000 UTC" firstStartedPulling="2026-04-20 23:17:44.662870655 +0000 UTC m=+294.334239779" lastFinishedPulling="2026-04-20 23:17:46.860434476 +0000 UTC m=+296.531803615" observedRunningTime="2026-04-20 23:17:47.905341953 +0000 UTC m=+297.576711100" watchObservedRunningTime="2026-04-20 23:17:47.907042579 +0000 UTC m=+297.578411727"
Apr 20 23:17:50.788908 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:50.788881 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/2.log"
Apr 20 23:17:50.789417 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:50.789229 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/2.log"
Apr 20 23:17:50.791304 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:50.791286 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 23:17:51.384054 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.384018 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"]
Apr 20 23:17:51.387016 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.387000 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.389634 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.389596 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 23:17:51.390408 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.390384 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 23:17:51.390408 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.390397 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zhcjn\""
Apr 20 23:17:51.398385 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.398364 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"]
Apr 20 23:17:51.467689 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.467653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.467877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.467706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.467877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.467783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8kh\" (UniqueName: \"kubernetes.io/projected/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-kube-api-access-gs8kh\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.568161 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.568124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.568337 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.568208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8kh\" (UniqueName: \"kubernetes.io/projected/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-kube-api-access-gs8kh\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.568337 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.568250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.568618 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.568597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.568662 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.568608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.577110 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.577077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8kh\" (UniqueName: \"kubernetes.io/projected/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-kube-api-access-gs8kh\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.696424 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.696348 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:51.820045 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.820014 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"]
Apr 20 23:17:51.822489 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:17:51.822460 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cc06a42_967d_48d2_bd00_f6a4b73ceb40.slice/crio-e597c65b59ae06951fc05d42d98ae434d8392e8bd0ceb7be677050d6ff5d1f69 WatchSource:0}: Error finding container e597c65b59ae06951fc05d42d98ae434d8392e8bd0ceb7be677050d6ff5d1f69: Status 404 returned error can't find the container with id e597c65b59ae06951fc05d42d98ae434d8392e8bd0ceb7be677050d6ff5d1f69
Apr 20 23:17:51.892668 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:51.892634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf" event={"ID":"6cc06a42-967d-48d2-bd00-f6a4b73ceb40","Type":"ContainerStarted","Data":"e597c65b59ae06951fc05d42d98ae434d8392e8bd0ceb7be677050d6ff5d1f69"}
Apr 20 23:17:52.896810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:52.896714 2577 generic.go:358] "Generic (PLEG): container finished" podID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerID="db024b858e5d19e684bc441a7f5e13811a278220f68d8c0119afd8c94bcb943f" exitCode=0
Apr 20 23:17:52.897272 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:52.896803 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf" event={"ID":"6cc06a42-967d-48d2-bd00-f6a4b73ceb40","Type":"ContainerDied","Data":"db024b858e5d19e684bc441a7f5e13811a278220f68d8c0119afd8c94bcb943f"}
Apr 20 23:17:52.897388 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:52.897370 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 23:17:53.901889 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:53.901855 2577 generic.go:358] "Generic (PLEG): container finished" podID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerID="c77e0c795e93ec025594e7f1495f21e01c1d8c9f779f68b8a4be8d0d5acef4d7" exitCode=0
Apr 20 23:17:53.902291 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:53.901975 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf" event={"ID":"6cc06a42-967d-48d2-bd00-f6a4b73ceb40","Type":"ContainerDied","Data":"c77e0c795e93ec025594e7f1495f21e01c1d8c9f779f68b8a4be8d0d5acef4d7"}
Apr 20 23:17:54.907048 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:54.907005 2577 generic.go:358] "Generic (PLEG): container finished" podID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerID="40267c743b224f3031bb2138f7f14df7d57e6e85aae2d3a700d95ffeff9b6d3f" exitCode=0
Apr 20 23:17:54.907496 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:54.907068 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf" event={"ID":"6cc06a42-967d-48d2-bd00-f6a4b73ceb40","Type":"ContainerDied","Data":"40267c743b224f3031bb2138f7f14df7d57e6e85aae2d3a700d95ffeff9b6d3f"}
Apr 20 23:17:56.030600 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.030575 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:56.102157 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.102117 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs8kh\" (UniqueName: \"kubernetes.io/projected/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-kube-api-access-gs8kh\") pod \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") "
Apr 20 23:17:56.102327 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.102186 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-bundle\") pod \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") "
Apr 20 23:17:56.102327 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.102218 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-util\") pod \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\" (UID: \"6cc06a42-967d-48d2-bd00-f6a4b73ceb40\") "
Apr 20 23:17:56.102911 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.102879 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-bundle" (OuterVolumeSpecName: "bundle") pod "6cc06a42-967d-48d2-bd00-f6a4b73ceb40" (UID: "6cc06a42-967d-48d2-bd00-f6a4b73ceb40"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:17:56.104305 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.104279 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-kube-api-access-gs8kh" (OuterVolumeSpecName: "kube-api-access-gs8kh") pod "6cc06a42-967d-48d2-bd00-f6a4b73ceb40" (UID: "6cc06a42-967d-48d2-bd00-f6a4b73ceb40"). InnerVolumeSpecName "kube-api-access-gs8kh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:17:56.107624 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.107592 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-util" (OuterVolumeSpecName: "util") pod "6cc06a42-967d-48d2-bd00-f6a4b73ceb40" (UID: "6cc06a42-967d-48d2-bd00-f6a4b73ceb40"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:17:56.203160 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.203067 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gs8kh\" (UniqueName: \"kubernetes.io/projected/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-kube-api-access-gs8kh\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:17:56.203160 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.203100 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:17:56.203160 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.203109 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6cc06a42-967d-48d2-bd00-f6a4b73ceb40-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:17:56.914940 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.914913 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf"
Apr 20 23:17:56.917317 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.917288 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c58wpxf" event={"ID":"6cc06a42-967d-48d2-bd00-f6a4b73ceb40","Type":"ContainerDied","Data":"e597c65b59ae06951fc05d42d98ae434d8392e8bd0ceb7be677050d6ff5d1f69"}
Apr 20 23:17:56.917317 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:17:56.917318 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e597c65b59ae06951fc05d42d98ae434d8392e8bd0ceb7be677050d6ff5d1f69"
Apr 20 23:18:01.596321 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.596284 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm"]
Apr 20 23:18:01.596799 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.596629 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerName="util"
Apr 20 23:18:01.596799 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.596648 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerName="util"
Apr 20 23:18:01.596799 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.596665 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerName="pull"
Apr 20 23:18:01.596799 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.596673 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerName="pull"
Apr 20 23:18:01.596799 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.596683 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerName="extract"
Apr 20 23:18:01.596799 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.596691 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerName="extract"
Apr 20 23:18:01.596799 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.596758 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="6cc06a42-967d-48d2-bd00-f6a4b73ceb40" containerName="extract"
Apr 20 23:18:01.599619 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.599597 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm"
Apr 20 23:18:01.602154 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.602134 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 23:18:01.602867 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.602850 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zhcjn\""
Apr 20 23:18:01.602959 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.602853 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 23:18:01.605619 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.605596 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm"]
Apr 20 23:18:01.640657 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.640617 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm\" (UID: 
\"e563299e-e203-4993-83ae-b9a5956e9247\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:01.640657 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.640657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzmd\" (UniqueName: \"kubernetes.io/projected/e563299e-e203-4993-83ae-b9a5956e9247-kube-api-access-gzzmd\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:01.640922 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.640760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:01.741837 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.741800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:01.741837 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.741841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzmd\" (UniqueName: \"kubernetes.io/projected/e563299e-e203-4993-83ae-b9a5956e9247-kube-api-access-gzzmd\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm\" (UID: 
\"e563299e-e203-4993-83ae-b9a5956e9247\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:01.742091 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.741869 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:01.742280 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.742259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:01.742315 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.742281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:01.750221 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.750198 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzmd\" (UniqueName: \"kubernetes.io/projected/e563299e-e203-4993-83ae-b9a5956e9247-kube-api-access-gzzmd\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:01.909095 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:01.909004 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:02.042963 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:02.042913 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm"] Apr 20 23:18:02.045845 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:18:02.045818 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode563299e_e203_4993_83ae_b9a5956e9247.slice/crio-3058025028a8bc54a3095aeb1735669982e40d64ed3a3571dd024c1cdb814ac7 WatchSource:0}: Error finding container 3058025028a8bc54a3095aeb1735669982e40d64ed3a3571dd024c1cdb814ac7: Status 404 returned error can't find the container with id 3058025028a8bc54a3095aeb1735669982e40d64ed3a3571dd024c1cdb814ac7 Apr 20 23:18:02.937601 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:02.937564 2577 generic.go:358] "Generic (PLEG): container finished" podID="e563299e-e203-4993-83ae-b9a5956e9247" containerID="89128aef386e7acefb4f54d2fc8ec7836939fb91df3b8a30f90ff3d5a7ef13ca" exitCode=0 Apr 20 23:18:02.937601 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:02.937605 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" event={"ID":"e563299e-e203-4993-83ae-b9a5956e9247","Type":"ContainerDied","Data":"89128aef386e7acefb4f54d2fc8ec7836939fb91df3b8a30f90ff3d5a7ef13ca"} Apr 20 23:18:02.938021 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:02.937627 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" event={"ID":"e563299e-e203-4993-83ae-b9a5956e9247","Type":"ContainerStarted","Data":"3058025028a8bc54a3095aeb1735669982e40d64ed3a3571dd024c1cdb814ac7"} Apr 20 23:18:03.852686 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.852654 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx"] Apr 20 23:18:03.855869 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.855846 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:03.858244 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.858218 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 23:18:03.858350 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.858247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 23:18:03.858350 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.858222 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 23:18:03.858444 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.858373 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-c9lhx\"" Apr 20 23:18:03.860566 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.860550 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 23:18:03.899271 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.899243 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx"] Apr 20 23:18:03.943904 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.943873 2577 generic.go:358] "Generic (PLEG): container finished" podID="e563299e-e203-4993-83ae-b9a5956e9247" containerID="6fef00099d1fcbdb41c8ba6a4f300f5671abf7d86716a204eba77e2f5f10d0fd" exitCode=0 Apr 20 23:18:03.944405 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.943977 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" event={"ID":"e563299e-e203-4993-83ae-b9a5956e9247","Type":"ContainerDied","Data":"6fef00099d1fcbdb41c8ba6a4f300f5671abf7d86716a204eba77e2f5f10d0fd"} Apr 20 23:18:03.957060 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.957035 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5f9t\" (UniqueName: \"kubernetes.io/projected/6d311889-eb60-4d87-864d-2956f43f404a-kube-api-access-w5f9t\") pod \"opendatahub-operator-controller-manager-5d79c565b7-znsfx\" (UID: \"6d311889-eb60-4d87-864d-2956f43f404a\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:03.957166 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.957101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d311889-eb60-4d87-864d-2956f43f404a-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-znsfx\" (UID: \"6d311889-eb60-4d87-864d-2956f43f404a\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:03.957166 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:03.957156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/6d311889-eb60-4d87-864d-2956f43f404a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-znsfx\" (UID: \"6d311889-eb60-4d87-864d-2956f43f404a\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:04.057872 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.057824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d311889-eb60-4d87-864d-2956f43f404a-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-znsfx\" (UID: \"6d311889-eb60-4d87-864d-2956f43f404a\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:04.058058 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.057892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d311889-eb60-4d87-864d-2956f43f404a-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-znsfx\" (UID: \"6d311889-eb60-4d87-864d-2956f43f404a\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:04.058058 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.057922 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5f9t\" (UniqueName: \"kubernetes.io/projected/6d311889-eb60-4d87-864d-2956f43f404a-kube-api-access-w5f9t\") pod \"opendatahub-operator-controller-manager-5d79c565b7-znsfx\" (UID: \"6d311889-eb60-4d87-864d-2956f43f404a\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:04.060638 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.060560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d311889-eb60-4d87-864d-2956f43f404a-apiservice-cert\") pod 
\"opendatahub-operator-controller-manager-5d79c565b7-znsfx\" (UID: \"6d311889-eb60-4d87-864d-2956f43f404a\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:04.060638 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.060588 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d311889-eb60-4d87-864d-2956f43f404a-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d79c565b7-znsfx\" (UID: \"6d311889-eb60-4d87-864d-2956f43f404a\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:04.069127 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.069093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5f9t\" (UniqueName: \"kubernetes.io/projected/6d311889-eb60-4d87-864d-2956f43f404a-kube-api-access-w5f9t\") pod \"opendatahub-operator-controller-manager-5d79c565b7-znsfx\" (UID: \"6d311889-eb60-4d87-864d-2956f43f404a\") " pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:04.165426 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.165331 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:04.294426 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.294397 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx"] Apr 20 23:18:04.296248 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:18:04.296223 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d311889_eb60_4d87_864d_2956f43f404a.slice/crio-cb6abc39a94dc60264c0b8a865ed227c66344919cb5a4f58430cba4e37f296be WatchSource:0}: Error finding container cb6abc39a94dc60264c0b8a865ed227c66344919cb5a4f58430cba4e37f296be: Status 404 returned error can't find the container with id cb6abc39a94dc60264c0b8a865ed227c66344919cb5a4f58430cba4e37f296be Apr 20 23:18:04.949047 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.949007 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" event={"ID":"6d311889-eb60-4d87-864d-2956f43f404a","Type":"ContainerStarted","Data":"cb6abc39a94dc60264c0b8a865ed227c66344919cb5a4f58430cba4e37f296be"} Apr 20 23:18:04.951124 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.951089 2577 generic.go:358] "Generic (PLEG): container finished" podID="e563299e-e203-4993-83ae-b9a5956e9247" containerID="52f0bb10f0ba03ae29c07225c48bd73e4ddade95e48faa608c24bef8e670c854" exitCode=0 Apr 20 23:18:04.951260 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:04.951151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" event={"ID":"e563299e-e203-4993-83ae-b9a5956e9247","Type":"ContainerDied","Data":"52f0bb10f0ba03ae29c07225c48bd73e4ddade95e48faa608c24bef8e670c854"} Apr 20 23:18:07.077899 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.077876 2577 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:07.183026 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.182992 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-util\") pod \"e563299e-e203-4993-83ae-b9a5956e9247\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " Apr 20 23:18:07.183189 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.183089 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-bundle\") pod \"e563299e-e203-4993-83ae-b9a5956e9247\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " Apr 20 23:18:07.183189 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.183148 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzzmd\" (UniqueName: \"kubernetes.io/projected/e563299e-e203-4993-83ae-b9a5956e9247-kube-api-access-gzzmd\") pod \"e563299e-e203-4993-83ae-b9a5956e9247\" (UID: \"e563299e-e203-4993-83ae-b9a5956e9247\") " Apr 20 23:18:07.183863 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.183813 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-bundle" (OuterVolumeSpecName: "bundle") pod "e563299e-e203-4993-83ae-b9a5956e9247" (UID: "e563299e-e203-4993-83ae-b9a5956e9247"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:18:07.185133 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.185112 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e563299e-e203-4993-83ae-b9a5956e9247-kube-api-access-gzzmd" (OuterVolumeSpecName: "kube-api-access-gzzmd") pod "e563299e-e203-4993-83ae-b9a5956e9247" (UID: "e563299e-e203-4993-83ae-b9a5956e9247"). InnerVolumeSpecName "kube-api-access-gzzmd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:18:07.189544 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.189517 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-util" (OuterVolumeSpecName: "util") pod "e563299e-e203-4993-83ae-b9a5956e9247" (UID: "e563299e-e203-4993-83ae-b9a5956e9247"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:18:07.284303 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.284266 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:18:07.284303 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.284296 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e563299e-e203-4993-83ae-b9a5956e9247-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:18:07.284303 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.284305 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gzzmd\" (UniqueName: \"kubernetes.io/projected/e563299e-e203-4993-83ae-b9a5956e9247-kube-api-access-gzzmd\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:18:07.962163 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.962126 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" event={"ID":"6d311889-eb60-4d87-864d-2956f43f404a","Type":"ContainerStarted","Data":"f93f385e03f452e2b2d3b14cfef1a1899e9cd53a9f5b1ec73bc286146247c71f"} Apr 20 23:18:07.962342 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.962185 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:07.963750 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.963724 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" event={"ID":"e563299e-e203-4993-83ae-b9a5956e9247","Type":"ContainerDied","Data":"3058025028a8bc54a3095aeb1735669982e40d64ed3a3571dd024c1cdb814ac7"} Apr 20 23:18:07.963750 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.963751 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3058025028a8bc54a3095aeb1735669982e40d64ed3a3571dd024c1cdb814ac7" Apr 20 23:18:07.963970 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.963772 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9pk7lm" Apr 20 23:18:07.983175 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:07.983129 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" podStartSLOduration=2.169866654 podStartE2EDuration="4.983114346s" podCreationTimestamp="2026-04-20 23:18:03 +0000 UTC" firstStartedPulling="2026-04-20 23:18:04.297915197 +0000 UTC m=+313.969284322" lastFinishedPulling="2026-04-20 23:18:07.111162877 +0000 UTC m=+316.782532014" observedRunningTime="2026-04-20 23:18:07.981578168 +0000 UTC m=+317.652947315" watchObservedRunningTime="2026-04-20 23:18:07.983114346 +0000 UTC m=+317.654483493" Apr 20 23:18:18.968863 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:18.968832 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5d79c565b7-znsfx" Apr 20 23:18:22.257099 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.257062 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw"] Apr 20 23:18:22.257502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.257301 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e563299e-e203-4993-83ae-b9a5956e9247" containerName="extract" Apr 20 23:18:22.257502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.257311 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e563299e-e203-4993-83ae-b9a5956e9247" containerName="extract" Apr 20 23:18:22.257502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.257327 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e563299e-e203-4993-83ae-b9a5956e9247" containerName="pull" Apr 20 23:18:22.257502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.257333 2577 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e563299e-e203-4993-83ae-b9a5956e9247" containerName="pull" Apr 20 23:18:22.257502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.257341 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e563299e-e203-4993-83ae-b9a5956e9247" containerName="util" Apr 20 23:18:22.257502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.257347 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e563299e-e203-4993-83ae-b9a5956e9247" containerName="util" Apr 20 23:18:22.257502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.257386 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e563299e-e203-4993-83ae-b9a5956e9247" containerName="extract" Apr 20 23:18:22.262798 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.262776 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.268040 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.268012 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bw4cf\"" Apr 20 23:18:22.268040 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.268030 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 23:18:22.268040 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.268011 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 23:18:22.268355 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.268021 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 23:18:22.277037 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.277011 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw"] Apr 20 23:18:22.387868 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.387829 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09285221-0e2f-478e-b8fa-87bff03e5cef-cert\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.388058 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.387886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2gz\" (UniqueName: \"kubernetes.io/projected/09285221-0e2f-478e-b8fa-87bff03e5cef-kube-api-access-4d2gz\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.388058 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.387939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/09285221-0e2f-478e-b8fa-87bff03e5cef-manager-config\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.388058 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.387999 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/09285221-0e2f-478e-b8fa-87bff03e5cef-metrics-cert\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.489159 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.489114 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09285221-0e2f-478e-b8fa-87bff03e5cef-cert\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.489287 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.489175 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2gz\" (UniqueName: \"kubernetes.io/projected/09285221-0e2f-478e-b8fa-87bff03e5cef-kube-api-access-4d2gz\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.489287 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.489209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/09285221-0e2f-478e-b8fa-87bff03e5cef-manager-config\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.489386 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.489333 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/09285221-0e2f-478e-b8fa-87bff03e5cef-metrics-cert\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.489834 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.489813 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/09285221-0e2f-478e-b8fa-87bff03e5cef-manager-config\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: 
\"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.491854 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.491818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/09285221-0e2f-478e-b8fa-87bff03e5cef-metrics-cert\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.491854 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.491837 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/09285221-0e2f-478e-b8fa-87bff03e5cef-cert\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.498336 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.498316 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2gz\" (UniqueName: \"kubernetes.io/projected/09285221-0e2f-478e-b8fa-87bff03e5cef-kube-api-access-4d2gz\") pod \"lws-controller-manager-6577b568b8-hlssw\" (UID: \"09285221-0e2f-478e-b8fa-87bff03e5cef\") " pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.576434 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.576398 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:22.697453 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:22.697364 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw"] Apr 20 23:18:22.699815 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:18:22.699780 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09285221_0e2f_478e_b8fa_87bff03e5cef.slice/crio-27c849331331bb82126526d280f20c9953747443b696ee20378ed0d69ec4ef9d WatchSource:0}: Error finding container 27c849331331bb82126526d280f20c9953747443b696ee20378ed0d69ec4ef9d: Status 404 returned error can't find the container with id 27c849331331bb82126526d280f20c9953747443b696ee20378ed0d69ec4ef9d Apr 20 23:18:23.015966 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:23.015844 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" event={"ID":"09285221-0e2f-478e-b8fa-87bff03e5cef","Type":"ContainerStarted","Data":"27c849331331bb82126526d280f20c9953747443b696ee20378ed0d69ec4ef9d"} Apr 20 23:18:25.024493 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:25.024450 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" event={"ID":"09285221-0e2f-478e-b8fa-87bff03e5cef","Type":"ContainerStarted","Data":"85be1d6db3fc783ae23dfe4506c254af5dbe121588f59275d16a6d7bc87cf852"} Apr 20 23:18:25.024893 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:25.024587 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:25.042908 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:25.042850 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" podStartSLOduration=1.520044901 podStartE2EDuration="3.042830187s" podCreationTimestamp="2026-04-20 23:18:22 +0000 UTC" firstStartedPulling="2026-04-20 23:18:22.70147403 +0000 UTC m=+332.372843155" lastFinishedPulling="2026-04-20 23:18:24.224259317 +0000 UTC m=+333.895628441" observedRunningTime="2026-04-20 23:18:25.041431247 +0000 UTC m=+334.712800395" watchObservedRunningTime="2026-04-20 23:18:25.042830187 +0000 UTC m=+334.714199340" Apr 20 23:18:33.455091 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.455008 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4"] Apr 20 23:18:33.458385 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.458364 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.460842 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.460819 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 23:18:33.460842 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.460836 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zhcjn\"" Apr 20 23:18:33.461640 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.461620 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 23:18:33.471244 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.471221 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4"] Apr 20 23:18:33.578674 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.578638 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.578674 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.578680 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.578884 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.578709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-769vm\" (UniqueName: \"kubernetes.io/projected/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-kube-api-access-769vm\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.679660 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.679627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.679830 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.679666 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.679830 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.679706 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-769vm\" (UniqueName: \"kubernetes.io/projected/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-kube-api-access-769vm\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.680120 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.680088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.680120 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.680111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.687901 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.687876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-769vm\" (UniqueName: 
\"kubernetes.io/projected/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-kube-api-access-769vm\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.767071 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.766977 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:33.905533 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:33.905497 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4"] Apr 20 23:18:33.918044 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:18:33.918003 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce1d2b6_b4e9_4819_b259_69b20f7315d5.slice/crio-e0bba538665fbcaae798cc442161b4f85203532fe8909198673b4c19a3fd39a4 WatchSource:0}: Error finding container e0bba538665fbcaae798cc442161b4f85203532fe8909198673b4c19a3fd39a4: Status 404 returned error can't find the container with id e0bba538665fbcaae798cc442161b4f85203532fe8909198673b4c19a3fd39a4 Apr 20 23:18:34.053100 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:34.053015 2577 generic.go:358] "Generic (PLEG): container finished" podID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerID="eccc667b5b5640c16503020d2059a4328c2cadd2ae394c715779512ebd819efb" exitCode=0 Apr 20 23:18:34.053238 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:34.053086 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" event={"ID":"9ce1d2b6-b4e9-4819-b259-69b20f7315d5","Type":"ContainerDied","Data":"eccc667b5b5640c16503020d2059a4328c2cadd2ae394c715779512ebd819efb"} Apr 20 
23:18:34.053238 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:34.053131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" event={"ID":"9ce1d2b6-b4e9-4819-b259-69b20f7315d5","Type":"ContainerStarted","Data":"e0bba538665fbcaae798cc442161b4f85203532fe8909198673b4c19a3fd39a4"} Apr 20 23:18:36.029054 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:36.029020 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6577b568b8-hlssw" Apr 20 23:18:36.060804 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:36.060754 2577 generic.go:358] "Generic (PLEG): container finished" podID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerID="cd7a2038908fe85ce16d88c107f46dc32dba80e57dffa5083af710517fdf3dc9" exitCode=0 Apr 20 23:18:36.061019 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:36.060824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" event={"ID":"9ce1d2b6-b4e9-4819-b259-69b20f7315d5","Type":"ContainerDied","Data":"cd7a2038908fe85ce16d88c107f46dc32dba80e57dffa5083af710517fdf3dc9"} Apr 20 23:18:37.066547 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:37.066517 2577 generic.go:358] "Generic (PLEG): container finished" podID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerID="fa938a19b1cd96564e623e03610f8873dba68660c6abbc0a5af4c89ac1f674f7" exitCode=0 Apr 20 23:18:37.066918 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:37.066602 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" event={"ID":"9ce1d2b6-b4e9-4819-b259-69b20f7315d5","Type":"ContainerDied","Data":"fa938a19b1cd96564e623e03610f8873dba68660c6abbc0a5af4c89ac1f674f7"} Apr 20 23:18:38.196347 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.196322 2577 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:38.317171 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.317137 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-util\") pod \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " Apr 20 23:18:38.317345 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.317219 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-769vm\" (UniqueName: \"kubernetes.io/projected/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-kube-api-access-769vm\") pod \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " Apr 20 23:18:38.317345 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.317265 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-bundle\") pod \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\" (UID: \"9ce1d2b6-b4e9-4819-b259-69b20f7315d5\") " Apr 20 23:18:38.318158 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.318128 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-bundle" (OuterVolumeSpecName: "bundle") pod "9ce1d2b6-b4e9-4819-b259-69b20f7315d5" (UID: "9ce1d2b6-b4e9-4819-b259-69b20f7315d5"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:18:38.319289 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.319258 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-kube-api-access-769vm" (OuterVolumeSpecName: "kube-api-access-769vm") pod "9ce1d2b6-b4e9-4819-b259-69b20f7315d5" (UID: "9ce1d2b6-b4e9-4819-b259-69b20f7315d5"). InnerVolumeSpecName "kube-api-access-769vm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:18:38.322762 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.322723 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-util" (OuterVolumeSpecName: "util") pod "9ce1d2b6-b4e9-4819-b259-69b20f7315d5" (UID: "9ce1d2b6-b4e9-4819-b259-69b20f7315d5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:18:38.418795 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.418757 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-769vm\" (UniqueName: \"kubernetes.io/projected/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-kube-api-access-769vm\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:18:38.418795 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.418787 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:18:38.418795 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:38.418796 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ce1d2b6-b4e9-4819-b259-69b20f7315d5-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:18:39.078545 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:39.078518 2577 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" Apr 20 23:18:39.078734 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:39.078515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48355t9j4" event={"ID":"9ce1d2b6-b4e9-4819-b259-69b20f7315d5","Type":"ContainerDied","Data":"e0bba538665fbcaae798cc442161b4f85203532fe8909198673b4c19a3fd39a4"} Apr 20 23:18:39.078734 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:39.078625 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0bba538665fbcaae798cc442161b4f85203532fe8909198673b4c19a3fd39a4" Apr 20 23:18:47.644792 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.644747 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql"] Apr 20 23:18:47.645293 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.645127 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerName="extract" Apr 20 23:18:47.645293 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.645145 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerName="extract" Apr 20 23:18:47.645293 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.645164 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerName="pull" Apr 20 23:18:47.645293 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.645171 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerName="pull" Apr 20 23:18:47.645293 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.645182 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerName="util" Apr 20 23:18:47.645293 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.645190 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerName="util" Apr 20 23:18:47.645293 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.645266 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ce1d2b6-b4e9-4819-b259-69b20f7315d5" containerName="extract" Apr 20 23:18:47.654795 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.654761 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.657824 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.657795 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 23:18:47.657997 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.657847 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-zhcjn\"" Apr 20 23:18:47.658141 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.658123 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 23:18:47.687384 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.687351 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql"] Apr 20 23:18:47.794099 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.794052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qw6d\" (UniqueName: \"kubernetes.io/projected/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-kube-api-access-7qw6d\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql\" 
(UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.794279 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.794111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.794279 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.794182 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.895512 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.895406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qw6d\" (UniqueName: \"kubernetes.io/projected/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-kube-api-access-7qw6d\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.895512 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.895464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql\" (UID: 
\"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.895512 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.895482 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.895810 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.895795 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.895852 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.895834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.905999 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.905975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qw6d\" (UniqueName: \"kubernetes.io/projected/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-kube-api-access-7qw6d\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:47.963911 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:47.963871 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:48.092189 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:48.092161 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql"] Apr 20 23:18:48.095139 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:18:48.095079 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c8f273_40f1_4df5_9d0a_6ebbfe185a65.slice/crio-9a16bc7bade00f0e9d520d541bc10d6fdf4766a403c0f244920d8d833b51f9c8 WatchSource:0}: Error finding container 9a16bc7bade00f0e9d520d541bc10d6fdf4766a403c0f244920d8d833b51f9c8: Status 404 returned error can't find the container with id 9a16bc7bade00f0e9d520d541bc10d6fdf4766a403c0f244920d8d833b51f9c8 Apr 20 23:18:48.112738 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:48.112700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" event={"ID":"45c8f273-40f1-4df5-9d0a-6ebbfe185a65","Type":"ContainerStarted","Data":"9a16bc7bade00f0e9d520d541bc10d6fdf4766a403c0f244920d8d833b51f9c8"} Apr 20 23:18:49.117204 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:49.117167 2577 generic.go:358] "Generic (PLEG): container finished" podID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerID="24fe9a5456364c92191d1a0244915dc485d219f5c514c2b2447e23b2c2b422b3" exitCode=0 Apr 20 23:18:49.117620 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:49.117256 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" event={"ID":"45c8f273-40f1-4df5-9d0a-6ebbfe185a65","Type":"ContainerDied","Data":"24fe9a5456364c92191d1a0244915dc485d219f5c514c2b2447e23b2c2b422b3"} Apr 20 23:18:51.125497 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:51.125462 2577 generic.go:358] "Generic (PLEG): container finished" podID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerID="adf9840ed31691f952c3d1d8f7a7bc166d7941541384191b27f6bc526c62463e" exitCode=0 Apr 20 23:18:51.125865 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:51.125539 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" event={"ID":"45c8f273-40f1-4df5-9d0a-6ebbfe185a65","Type":"ContainerDied","Data":"adf9840ed31691f952c3d1d8f7a7bc166d7941541384191b27f6bc526c62463e"} Apr 20 23:18:52.131007 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:52.130972 2577 generic.go:358] "Generic (PLEG): container finished" podID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerID="6267315d81ece430cd4610f95e562f4bb2327c4a84d29b797534993b5b1f051b" exitCode=0 Apr 20 23:18:52.131472 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:52.131025 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" event={"ID":"45c8f273-40f1-4df5-9d0a-6ebbfe185a65","Type":"ContainerDied","Data":"6267315d81ece430cd4610f95e562f4bb2327c4a84d29b797534993b5b1f051b"} Apr 20 23:18:53.248904 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.248881 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:18:53.340814 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.340771 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-util\") pod \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " Apr 20 23:18:53.341040 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.340832 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qw6d\" (UniqueName: \"kubernetes.io/projected/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-kube-api-access-7qw6d\") pod \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " Apr 20 23:18:53.341040 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.340891 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-bundle\") pod \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\" (UID: \"45c8f273-40f1-4df5-9d0a-6ebbfe185a65\") " Apr 20 23:18:53.341862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.341821 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-bundle" (OuterVolumeSpecName: "bundle") pod "45c8f273-40f1-4df5-9d0a-6ebbfe185a65" (UID: "45c8f273-40f1-4df5-9d0a-6ebbfe185a65"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:18:53.343016 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.342990 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-kube-api-access-7qw6d" (OuterVolumeSpecName: "kube-api-access-7qw6d") pod "45c8f273-40f1-4df5-9d0a-6ebbfe185a65" (UID: "45c8f273-40f1-4df5-9d0a-6ebbfe185a65"). InnerVolumeSpecName "kube-api-access-7qw6d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:18:53.346578 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.346542 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-util" (OuterVolumeSpecName: "util") pod "45c8f273-40f1-4df5-9d0a-6ebbfe185a65" (UID: "45c8f273-40f1-4df5-9d0a-6ebbfe185a65"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:18:53.442037 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.441917 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:18:53.442037 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.441981 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7qw6d\" (UniqueName: \"kubernetes.io/projected/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-kube-api-access-7qw6d\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:18:53.442037 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:53.441992 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/45c8f273-40f1-4df5-9d0a-6ebbfe185a65-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:18:54.139368 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:54.139329 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" event={"ID":"45c8f273-40f1-4df5-9d0a-6ebbfe185a65","Type":"ContainerDied","Data":"9a16bc7bade00f0e9d520d541bc10d6fdf4766a403c0f244920d8d833b51f9c8"} Apr 20 23:18:54.139368 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:54.139366 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a16bc7bade00f0e9d520d541bc10d6fdf4766a403c0f244920d8d833b51f9c8" Apr 20 23:18:54.139573 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:18:54.139378 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bwwql" Apr 20 23:19:09.673263 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.673233 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt"] Apr 20 23:19:09.673802 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.673486 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerName="pull" Apr 20 23:19:09.673802 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.673496 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerName="pull" Apr 20 23:19:09.673802 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.673507 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerName="extract" Apr 20 23:19:09.673802 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.673512 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerName="extract" Apr 20 23:19:09.673802 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.673523 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerName="util" Apr 20 23:19:09.673802 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.673528 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerName="util" Apr 20 23:19:09.673802 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.673572 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="45c8f273-40f1-4df5-9d0a-6ebbfe185a65" containerName="extract" Apr 20 23:19:09.676116 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.676090 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.678491 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.678470 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-b6hxn\"" Apr 20 23:19:09.678622 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.678505 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 23:19:09.678622 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.678555 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 23:19:09.678716 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.678637 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 23:19:09.685181 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.685155 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt"] Apr 20 23:19:09.761791 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.761751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.762024 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.761799 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.762024 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.761850 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqqs\" (UniqueName: \"kubernetes.io/projected/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-kube-api-access-vzqqs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.762024 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.761890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.762024 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.761917 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.762024 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.761998 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.762220 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.762038 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.762220 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.762058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.762220 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.762076 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863099 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863099 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863120 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863137 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863191 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqqs\" (UniqueName: \"kubernetes.io/projected/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-kube-api-access-vzqqs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863251 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863661 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863489 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863661 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863661 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863560 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863814 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: 
\"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.863874 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.863819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.864080 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.864060 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.865524 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.865508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.865957 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.865921 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-podinfo\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.873531 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.873505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqqs\" (UniqueName: \"kubernetes.io/projected/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-kube-api-access-vzqqs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.873656 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.873558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/3b7c0f57-f6d9-4449-869c-32f8ac8135ed-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt\" (UID: \"3b7c0f57-f6d9-4449-869c-32f8ac8135ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:09.987751 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:09.987644 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:10.110655 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:10.110613 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt"] Apr 20 23:19:10.113882 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:19:10.113845 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b7c0f57_f6d9_4449_869c_32f8ac8135ed.slice/crio-03547ca0656c5fa879b6d7a22fba3c6e570ec50c8c50731a6cf226b603761b9e WatchSource:0}: Error finding container 03547ca0656c5fa879b6d7a22fba3c6e570ec50c8c50731a6cf226b603761b9e: Status 404 returned error can't find the container with id 03547ca0656c5fa879b6d7a22fba3c6e570ec50c8c50731a6cf226b603761b9e Apr 20 23:19:10.196489 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:10.196446 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" event={"ID":"3b7c0f57-f6d9-4449-869c-32f8ac8135ed","Type":"ContainerStarted","Data":"03547ca0656c5fa879b6d7a22fba3c6e570ec50c8c50731a6cf226b603761b9e"} Apr 20 23:19:12.893120 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:12.893073 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 20 23:19:12.893366 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:12.893149 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 20 23:19:12.893366 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:12.893176 2577 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 20 23:19:13.209835 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:13.209753 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" event={"ID":"3b7c0f57-f6d9-4449-869c-32f8ac8135ed","Type":"ContainerStarted","Data":"7990d6e0e7070fb27f4d0bded77748ddb4495c2197f29f239b6d57d5807c35b6"} Apr 20 23:19:13.229795 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:13.229742 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" podStartSLOduration=1.45293953 podStartE2EDuration="4.229724905s" podCreationTimestamp="2026-04-20 23:19:09 +0000 UTC" firstStartedPulling="2026-04-20 23:19:10.116027463 +0000 UTC m=+379.787396590" lastFinishedPulling="2026-04-20 23:19:12.892812837 +0000 UTC m=+382.564181965" observedRunningTime="2026-04-20 23:19:13.227171861 +0000 UTC m=+382.898541008" watchObservedRunningTime="2026-04-20 23:19:13.229724905 +0000 UTC m=+382.901094121" Apr 20 23:19:13.988738 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:13.988694 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:13.993333 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:13.993310 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:14.213019 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:14.212984 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:14.214068 ip-10-0-134-166 kubenswrapper[2577]: 
I0420 23:19:14.214040 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt" Apr 20 23:19:22.564288 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.564252 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bhz52"] Apr 20 23:19:22.567669 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.567651 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" Apr 20 23:19:22.569912 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.569887 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 23:19:22.570766 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.570746 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-gzbfg\"" Apr 20 23:19:22.570862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.570753 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 23:19:22.577741 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.577712 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bhz52"] Apr 20 23:19:22.662877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.662840 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcc94\" (UniqueName: \"kubernetes.io/projected/e72a4484-6b5c-4ef6-a0e1-2b5ef467b466-kube-api-access-gcc94\") pod \"kuadrant-operator-catalog-bhz52\" (UID: \"e72a4484-6b5c-4ef6-a0e1-2b5ef467b466\") " pod="kuadrant-system/kuadrant-operator-catalog-bhz52" Apr 20 23:19:22.763222 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.763166 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gcc94\" (UniqueName: \"kubernetes.io/projected/e72a4484-6b5c-4ef6-a0e1-2b5ef467b466-kube-api-access-gcc94\") pod \"kuadrant-operator-catalog-bhz52\" (UID: \"e72a4484-6b5c-4ef6-a0e1-2b5ef467b466\") " pod="kuadrant-system/kuadrant-operator-catalog-bhz52" Apr 20 23:19:22.773897 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.773859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcc94\" (UniqueName: \"kubernetes.io/projected/e72a4484-6b5c-4ef6-a0e1-2b5ef467b466-kube-api-access-gcc94\") pod \"kuadrant-operator-catalog-bhz52\" (UID: \"e72a4484-6b5c-4ef6-a0e1-2b5ef467b466\") " pod="kuadrant-system/kuadrant-operator-catalog-bhz52" Apr 20 23:19:22.878085 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.878044 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" Apr 20 23:19:22.935294 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:22.935261 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bhz52"] Apr 20 23:19:23.003401 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.003371 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bhz52"] Apr 20 23:19:23.006050 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:19:23.006017 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode72a4484_6b5c_4ef6_a0e1_2b5ef467b466.slice/crio-f15d74d07b0bc94a8c2f94cebcace8529ba512e9f9a8995bf5fb794bdd8df3c0 WatchSource:0}: Error finding container f15d74d07b0bc94a8c2f94cebcace8529ba512e9f9a8995bf5fb794bdd8df3c0: Status 404 returned error can't find the container with id f15d74d07b0bc94a8c2f94cebcace8529ba512e9f9a8995bf5fb794bdd8df3c0 Apr 20 23:19:23.142360 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.142278 2577 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mhh54"] Apr 20 23:19:23.146610 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.146590 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mhh54" Apr 20 23:19:23.151914 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.151889 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mhh54"] Apr 20 23:19:23.165703 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.165676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfmsj\" (UniqueName: \"kubernetes.io/projected/99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58-kube-api-access-sfmsj\") pod \"kuadrant-operator-catalog-mhh54\" (UID: \"99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58\") " pod="kuadrant-system/kuadrant-operator-catalog-mhh54" Apr 20 23:19:23.245680 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.245639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" event={"ID":"e72a4484-6b5c-4ef6-a0e1-2b5ef467b466","Type":"ContainerStarted","Data":"f15d74d07b0bc94a8c2f94cebcace8529ba512e9f9a8995bf5fb794bdd8df3c0"} Apr 20 23:19:23.266260 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.266228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfmsj\" (UniqueName: \"kubernetes.io/projected/99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58-kube-api-access-sfmsj\") pod \"kuadrant-operator-catalog-mhh54\" (UID: \"99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58\") " pod="kuadrant-system/kuadrant-operator-catalog-mhh54" Apr 20 23:19:23.274196 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.274168 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfmsj\" (UniqueName: 
\"kubernetes.io/projected/99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58-kube-api-access-sfmsj\") pod \"kuadrant-operator-catalog-mhh54\" (UID: \"99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58\") " pod="kuadrant-system/kuadrant-operator-catalog-mhh54" Apr 20 23:19:23.457760 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.457662 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-mhh54" Apr 20 23:19:23.577313 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:23.577282 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-mhh54"] Apr 20 23:19:23.579382 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:19:23.579354 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99f30f6e_5db1_4fdf_a67f_3d1f77c0ba58.slice/crio-a94c077d73175b33fd877dd5fea3f0780bb74972efa3ed7ff6d73bd702d94887 WatchSource:0}: Error finding container a94c077d73175b33fd877dd5fea3f0780bb74972efa3ed7ff6d73bd702d94887: Status 404 returned error can't find the container with id a94c077d73175b33fd877dd5fea3f0780bb74972efa3ed7ff6d73bd702d94887 Apr 20 23:19:24.250635 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:24.250603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mhh54" event={"ID":"99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58","Type":"ContainerStarted","Data":"a94c077d73175b33fd877dd5fea3f0780bb74972efa3ed7ff6d73bd702d94887"} Apr 20 23:19:26.259493 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:26.259454 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" event={"ID":"e72a4484-6b5c-4ef6-a0e1-2b5ef467b466","Type":"ContainerStarted","Data":"e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58"} Apr 20 23:19:26.259988 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:26.259567 2577 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" podUID="e72a4484-6b5c-4ef6-a0e1-2b5ef467b466" containerName="registry-server" containerID="cri-o://e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58" gracePeriod=2 Apr 20 23:19:26.260935 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:26.260909 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-mhh54" event={"ID":"99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58","Type":"ContainerStarted","Data":"1584ed0386bb0d1efdef5044eb3a932271907344b38217b5ef244b6ad4817e15"} Apr 20 23:19:26.275279 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:26.275231 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" podStartSLOduration=1.939232377 podStartE2EDuration="4.275215855s" podCreationTimestamp="2026-04-20 23:19:22 +0000 UTC" firstStartedPulling="2026-04-20 23:19:23.007252571 +0000 UTC m=+392.678621697" lastFinishedPulling="2026-04-20 23:19:25.343236051 +0000 UTC m=+395.014605175" observedRunningTime="2026-04-20 23:19:26.272335465 +0000 UTC m=+395.943704612" watchObservedRunningTime="2026-04-20 23:19:26.275215855 +0000 UTC m=+395.946585001" Apr 20 23:19:26.288068 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:26.288016 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-mhh54" podStartSLOduration=1.522442759 podStartE2EDuration="3.288002373s" podCreationTimestamp="2026-04-20 23:19:23 +0000 UTC" firstStartedPulling="2026-04-20 23:19:23.58071403 +0000 UTC m=+393.252083156" lastFinishedPulling="2026-04-20 23:19:25.346273642 +0000 UTC m=+395.017642770" observedRunningTime="2026-04-20 23:19:26.286133535 +0000 UTC m=+395.957502682" watchObservedRunningTime="2026-04-20 23:19:26.288002373 +0000 UTC m=+395.959371520" Apr 20 23:19:26.499400 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:26.499378 
2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" Apr 20 23:19:26.597265 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:26.597230 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcc94\" (UniqueName: \"kubernetes.io/projected/e72a4484-6b5c-4ef6-a0e1-2b5ef467b466-kube-api-access-gcc94\") pod \"e72a4484-6b5c-4ef6-a0e1-2b5ef467b466\" (UID: \"e72a4484-6b5c-4ef6-a0e1-2b5ef467b466\") " Apr 20 23:19:26.599426 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:26.599401 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e72a4484-6b5c-4ef6-a0e1-2b5ef467b466-kube-api-access-gcc94" (OuterVolumeSpecName: "kube-api-access-gcc94") pod "e72a4484-6b5c-4ef6-a0e1-2b5ef467b466" (UID: "e72a4484-6b5c-4ef6-a0e1-2b5ef467b466"). InnerVolumeSpecName "kube-api-access-gcc94". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:19:26.698738 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:26.698702 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gcc94\" (UniqueName: \"kubernetes.io/projected/e72a4484-6b5c-4ef6-a0e1-2b5ef467b466-kube-api-access-gcc94\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:19:27.265646 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:27.265561 2577 generic.go:358] "Generic (PLEG): container finished" podID="e72a4484-6b5c-4ef6-a0e1-2b5ef467b466" containerID="e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58" exitCode=0 Apr 20 23:19:27.265646 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:27.265623 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" Apr 20 23:19:27.266127 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:27.265644 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" event={"ID":"e72a4484-6b5c-4ef6-a0e1-2b5ef467b466","Type":"ContainerDied","Data":"e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58"} Apr 20 23:19:27.266127 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:27.265681 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bhz52" event={"ID":"e72a4484-6b5c-4ef6-a0e1-2b5ef467b466","Type":"ContainerDied","Data":"f15d74d07b0bc94a8c2f94cebcace8529ba512e9f9a8995bf5fb794bdd8df3c0"} Apr 20 23:19:27.266127 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:27.265701 2577 scope.go:117] "RemoveContainer" containerID="e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58" Apr 20 23:19:27.274211 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:27.274192 2577 scope.go:117] "RemoveContainer" containerID="e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58" Apr 20 23:19:27.274463 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:19:27.274439 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58\": container with ID starting with e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58 not found: ID does not exist" containerID="e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58" Apr 20 23:19:27.274527 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:27.274472 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58"} err="failed to get container status \"e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58\": rpc 
error: code = NotFound desc = could not find container \"e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58\": container with ID starting with e48ed0b7b4b00a80836a86abd8954410fd41ab5046db99b092f8bc968fe7ea58 not found: ID does not exist" Apr 20 23:19:27.280725 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:27.280695 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bhz52"] Apr 20 23:19:27.284281 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:27.284257 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bhz52"] Apr 20 23:19:28.918434 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:28.918401 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e72a4484-6b5c-4ef6-a0e1-2b5ef467b466" path="/var/lib/kubelet/pods/e72a4484-6b5c-4ef6-a0e1-2b5ef467b466/volumes" Apr 20 23:19:33.458058 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:33.458017 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-mhh54" Apr 20 23:19:33.458454 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:33.458098 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-mhh54" Apr 20 23:19:33.479816 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:33.479788 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-mhh54" Apr 20 23:19:34.311558 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:34.311528 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-mhh54" Apr 20 23:19:38.373962 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.373913 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh"] Apr 20 23:19:38.374338 
ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.374205 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e72a4484-6b5c-4ef6-a0e1-2b5ef467b466" containerName="registry-server" Apr 20 23:19:38.374338 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.374216 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="e72a4484-6b5c-4ef6-a0e1-2b5ef467b466" containerName="registry-server" Apr 20 23:19:38.374338 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.374263 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="e72a4484-6b5c-4ef6-a0e1-2b5ef467b466" containerName="registry-server" Apr 20 23:19:38.377321 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.377303 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.379638 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.379616 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8v2gg\"" Apr 20 23:19:38.383984 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.383954 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh"] Apr 20 23:19:38.497546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.497510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.497726 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.497564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tx4zp\" (UniqueName: \"kubernetes.io/projected/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-kube-api-access-tx4zp\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.497726 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.497664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.598546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.598490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.598730 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.598574 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx4zp\" (UniqueName: \"kubernetes.io/projected/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-kube-api-access-tx4zp\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.598730 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.598619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.599010 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.598991 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.599056 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.599007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.607796 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.607770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx4zp\" (UniqueName: \"kubernetes.io/projected/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-kube-api-access-tx4zp\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.687713 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.687628 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" Apr 20 23:19:38.810373 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:38.810310 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh"] Apr 20 23:19:38.812522 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:19:38.812492 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e6194a3_80c9_4b57_ab03_32d6aaf59c47.slice/crio-de33ee113d1d6b0a06ce13f7e1c8b6911e71f90447a3b04d1719d9bdbf045f64 WatchSource:0}: Error finding container de33ee113d1d6b0a06ce13f7e1c8b6911e71f90447a3b04d1719d9bdbf045f64: Status 404 returned error can't find the container with id de33ee113d1d6b0a06ce13f7e1c8b6911e71f90447a3b04d1719d9bdbf045f64 Apr 20 23:19:39.175382 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.175347 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x"] Apr 20 23:19:39.178758 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.178734 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.192542 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.189810 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x"] Apr 20 23:19:39.304541 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.304497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.304691 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.304567 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.304691 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.304646 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsfch\" (UniqueName: \"kubernetes.io/projected/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-kube-api-access-gsfch\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.309661 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.309627 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerID="db49b918cc69aaf1ba8d8641f1336aa6c55c882c8f84c7f5014545ffb60ab536" exitCode=0 Apr 20 23:19:39.309787 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.309672 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" event={"ID":"1e6194a3-80c9-4b57-ab03-32d6aaf59c47","Type":"ContainerDied","Data":"db49b918cc69aaf1ba8d8641f1336aa6c55c882c8f84c7f5014545ffb60ab536"} Apr 20 23:19:39.309787 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.309700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" event={"ID":"1e6194a3-80c9-4b57-ab03-32d6aaf59c47","Type":"ContainerStarted","Data":"de33ee113d1d6b0a06ce13f7e1c8b6911e71f90447a3b04d1719d9bdbf045f64"} Apr 20 23:19:39.405530 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.405491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsfch\" (UniqueName: \"kubernetes.io/projected/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-kube-api-access-gsfch\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.405940 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.405540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.405940 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.405604 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.406071 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.406050 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.406110 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.406065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.418209 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.418185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsfch\" (UniqueName: \"kubernetes.io/projected/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-kube-api-access-gsfch\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.489656 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.489558 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" Apr 20 23:19:39.575960 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.575904 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r"] Apr 20 23:19:39.580876 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.580849 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.589283 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.589105 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r"] Apr 20 23:19:39.613832 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.613798 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x"] Apr 20 23:19:39.614694 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:19:39.614657 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b42f5d_467d_4e2f_bd50_980d960dbe2f.slice/crio-17aaf4b5b9815379e7797222506ce1d893770a5cfad13659bea674e1fee123b3 WatchSource:0}: Error finding container 17aaf4b5b9815379e7797222506ce1d893770a5cfad13659bea674e1fee123b3: Status 404 returned error can't find the container with id 17aaf4b5b9815379e7797222506ce1d893770a5cfad13659bea674e1fee123b3 Apr 20 23:19:39.708358 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.708322 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.708358 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.708362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.708571 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.708435 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6q6r\" (UniqueName: \"kubernetes.io/projected/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-kube-api-access-b6q6r\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.809886 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.809799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6q6r\" (UniqueName: \"kubernetes.io/projected/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-kube-api-access-b6q6r\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.809886 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.809870 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.810127 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.809894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.810321 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.810299 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.810377 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.810319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.818036 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.818011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6q6r\" (UniqueName: \"kubernetes.io/projected/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-kube-api-access-b6q6r\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.892245 ip-10-0-134-166 
kubenswrapper[2577]: I0420 23:19:39.892208 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" Apr 20 23:19:39.981759 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.981726 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4"] Apr 20 23:19:39.985260 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.985222 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:39.994788 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:39.994736 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4"] Apr 20 23:19:40.021385 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.021285 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r"] Apr 20 23:19:40.053625 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:19:40.053588 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bd464c7_cddf_45d2_8b39_67d3bbb399e5.slice/crio-6964ea325d57bce001655f8f0f21e4ccc1cf9f51cd74ac5fb35edb409253b063 WatchSource:0}: Error finding container 6964ea325d57bce001655f8f0f21e4ccc1cf9f51cd74ac5fb35edb409253b063: Status 404 returned error can't find the container with id 6964ea325d57bce001655f8f0f21e4ccc1cf9f51cd74ac5fb35edb409253b063 Apr 20 23:19:40.113404 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.113372 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj2vd\" (UniqueName: \"kubernetes.io/projected/5a276cc4-8103-401a-8e8c-2965d47b3cfc-kube-api-access-jj2vd\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.113518 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.113418 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.113518 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.113475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.214256 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.214222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.214388 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.214301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jj2vd\" (UniqueName: \"kubernetes.io/projected/5a276cc4-8103-401a-8e8c-2965d47b3cfc-kube-api-access-jj2vd\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.214388 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.214332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.214585 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.214563 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.214623 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.214589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.222796 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.222772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj2vd\" (UniqueName: \"kubernetes.io/projected/5a276cc4-8103-401a-8e8c-2965d47b3cfc-kube-api-access-jj2vd\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.298974 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.298911 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" Apr 20 23:19:40.315620 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.315589 2577 generic.go:358] "Generic (PLEG): container finished" podID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerID="c19469658de7a21f3dcf35133695f8875516fee7e6c80e6119a28098dd0cbf05" exitCode=0 Apr 20 23:19:40.315773 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.315678 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" event={"ID":"5bd464c7-cddf-45d2-8b39-67d3bbb399e5","Type":"ContainerDied","Data":"c19469658de7a21f3dcf35133695f8875516fee7e6c80e6119a28098dd0cbf05"} Apr 20 23:19:40.315773 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.315717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" event={"ID":"5bd464c7-cddf-45d2-8b39-67d3bbb399e5","Type":"ContainerStarted","Data":"6964ea325d57bce001655f8f0f21e4ccc1cf9f51cd74ac5fb35edb409253b063"} Apr 20 23:19:40.317134 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.317112 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerID="96169de1ac0c07280f39d44df75caace5ba7b0b3bab1d1dffcaf174643ed9573" exitCode=0 Apr 20 23:19:40.317246 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.317188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" event={"ID":"c2b42f5d-467d-4e2f-bd50-980d960dbe2f","Type":"ContainerDied","Data":"96169de1ac0c07280f39d44df75caace5ba7b0b3bab1d1dffcaf174643ed9573"} 
Apr 20 23:19:40.317246 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.317221 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" event={"ID":"c2b42f5d-467d-4e2f-bd50-980d960dbe2f","Type":"ContainerStarted","Data":"17aaf4b5b9815379e7797222506ce1d893770a5cfad13659bea674e1fee123b3"}
Apr 20 23:19:40.318970 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.318881 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerID="e40fbedcfe81f359e5a757165b0d88ed8998aca6afd87fb767d4f515e98c3908" exitCode=0
Apr 20 23:19:40.318970 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.318926 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" event={"ID":"1e6194a3-80c9-4b57-ab03-32d6aaf59c47","Type":"ContainerDied","Data":"e40fbedcfe81f359e5a757165b0d88ed8998aca6afd87fb767d4f515e98c3908"}
Apr 20 23:19:40.424632 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:40.424608 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4"]
Apr 20 23:19:40.427223 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:19:40.427191 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a276cc4_8103_401a_8e8c_2965d47b3cfc.slice/crio-5374726c678b944557a43c5ece30418ea201e45d30258163da02115b0750693f WatchSource:0}: Error finding container 5374726c678b944557a43c5ece30418ea201e45d30258163da02115b0750693f: Status 404 returned error can't find the container with id 5374726c678b944557a43c5ece30418ea201e45d30258163da02115b0750693f
Apr 20 23:19:41.323661 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:41.323626 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerID="328d0827172fa194d0830d1a6c2c65027c7ec704d2f87e4f9f961b7229c35bb0" exitCode=0
Apr 20 23:19:41.323877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:41.323717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" event={"ID":"5a276cc4-8103-401a-8e8c-2965d47b3cfc","Type":"ContainerDied","Data":"328d0827172fa194d0830d1a6c2c65027c7ec704d2f87e4f9f961b7229c35bb0"}
Apr 20 23:19:41.323877 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:41.323762 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" event={"ID":"5a276cc4-8103-401a-8e8c-2965d47b3cfc","Type":"ContainerStarted","Data":"5374726c678b944557a43c5ece30418ea201e45d30258163da02115b0750693f"}
Apr 20 23:19:41.325523 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:41.325501 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerID="d13694b08ed35e47d73ec12272b1fe3fb94ef98a6713187fff782da35d919f0f" exitCode=0
Apr 20 23:19:41.325655 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:41.325568 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" event={"ID":"c2b42f5d-467d-4e2f-bd50-980d960dbe2f","Type":"ContainerDied","Data":"d13694b08ed35e47d73ec12272b1fe3fb94ef98a6713187fff782da35d919f0f"}
Apr 20 23:19:41.327461 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:41.327426 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerID="234188c9816f59e8008db3cb560b6f802ce8bf1a917a7113ed2f17a9bad019fd" exitCode=0
Apr 20 23:19:41.327569 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:41.327477 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" event={"ID":"1e6194a3-80c9-4b57-ab03-32d6aaf59c47","Type":"ContainerDied","Data":"234188c9816f59e8008db3cb560b6f802ce8bf1a917a7113ed2f17a9bad019fd"}
Apr 20 23:19:41.329185 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:41.329164 2577 generic.go:358] "Generic (PLEG): container finished" podID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerID="36bee63c7ffe2956d4b0b81c259377ecb374f478e90a6ae7546dedb258a57920" exitCode=0
Apr 20 23:19:41.329287 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:41.329193 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" event={"ID":"5bd464c7-cddf-45d2-8b39-67d3bbb399e5","Type":"ContainerDied","Data":"36bee63c7ffe2956d4b0b81c259377ecb374f478e90a6ae7546dedb258a57920"}
Apr 20 23:19:42.334463 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.334423 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerID="3d70ce6b628449114a0648adfba915a9670300510d2f540965600a16f919ae5a" exitCode=0
Apr 20 23:19:42.334932 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.334513 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" event={"ID":"5a276cc4-8103-401a-8e8c-2965d47b3cfc","Type":"ContainerDied","Data":"3d70ce6b628449114a0648adfba915a9670300510d2f540965600a16f919ae5a"}
Apr 20 23:19:42.336617 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.336594 2577 generic.go:358] "Generic (PLEG): container finished" podID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerID="f1414a97ef99f74edba180d3574a69960d6dcca9a73102bc378c34945ae7d7d3" exitCode=0
Apr 20 23:19:42.336725 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.336678 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" event={"ID":"c2b42f5d-467d-4e2f-bd50-980d960dbe2f","Type":"ContainerDied","Data":"f1414a97ef99f74edba180d3574a69960d6dcca9a73102bc378c34945ae7d7d3"}
Apr 20 23:19:42.338578 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.338555 2577 generic.go:358] "Generic (PLEG): container finished" podID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerID="1315e3e2bccc7b98b61bac5137f1b83fa8e0f9247ee64a8b7513789d09dad495" exitCode=0
Apr 20 23:19:42.338668 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.338627 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" event={"ID":"5bd464c7-cddf-45d2-8b39-67d3bbb399e5","Type":"ContainerDied","Data":"1315e3e2bccc7b98b61bac5137f1b83fa8e0f9247ee64a8b7513789d09dad495"}
Apr 20 23:19:42.464157 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.464133 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh"
Apr 20 23:19:42.533176 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.533145 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-util\") pod \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") "
Apr 20 23:19:42.533374 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.533202 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx4zp\" (UniqueName: \"kubernetes.io/projected/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-kube-api-access-tx4zp\") pod \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") "
Apr 20 23:19:42.533374 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.533229 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-bundle\") pod \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\" (UID: \"1e6194a3-80c9-4b57-ab03-32d6aaf59c47\") "
Apr 20 23:19:42.533680 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.533653 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-bundle" (OuterVolumeSpecName: "bundle") pod "1e6194a3-80c9-4b57-ab03-32d6aaf59c47" (UID: "1e6194a3-80c9-4b57-ab03-32d6aaf59c47"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:19:42.535574 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.535545 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-kube-api-access-tx4zp" (OuterVolumeSpecName: "kube-api-access-tx4zp") pod "1e6194a3-80c9-4b57-ab03-32d6aaf59c47" (UID: "1e6194a3-80c9-4b57-ab03-32d6aaf59c47"). InnerVolumeSpecName "kube-api-access-tx4zp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:19:42.538113 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.538094 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-util" (OuterVolumeSpecName: "util") pod "1e6194a3-80c9-4b57-ab03-32d6aaf59c47" (UID: "1e6194a3-80c9-4b57-ab03-32d6aaf59c47"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:19:42.634407 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.634349 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:42.634407 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.634398 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tx4zp\" (UniqueName: \"kubernetes.io/projected/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-kube-api-access-tx4zp\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:42.634407 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:42.634409 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1e6194a3-80c9-4b57-ab03-32d6aaf59c47-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:43.343608 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.343568 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerID="b44584843c0ebb7fca804c9849eb8ecb5ffbeb6506d451fac23a27a4727db340" exitCode=0
Apr 20 23:19:43.344102 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.343659 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" event={"ID":"5a276cc4-8103-401a-8e8c-2965d47b3cfc","Type":"ContainerDied","Data":"b44584843c0ebb7fca804c9849eb8ecb5ffbeb6506d451fac23a27a4727db340"}
Apr 20 23:19:43.345245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.345224 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh"
Apr 20 23:19:43.345376 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.345247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh" event={"ID":"1e6194a3-80c9-4b57-ab03-32d6aaf59c47","Type":"ContainerDied","Data":"de33ee113d1d6b0a06ce13f7e1c8b6911e71f90447a3b04d1719d9bdbf045f64"}
Apr 20 23:19:43.345376 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.345281 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de33ee113d1d6b0a06ce13f7e1c8b6911e71f90447a3b04d1719d9bdbf045f64"
Apr 20 23:19:43.484461 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.484436 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x"
Apr 20 23:19:43.497561 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.497538 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r"
Apr 20 23:19:43.541572 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.541537 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsfch\" (UniqueName: \"kubernetes.io/projected/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-kube-api-access-gsfch\") pod \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") "
Apr 20 23:19:43.541744 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.541609 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-bundle\") pod \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") "
Apr 20 23:19:43.541744 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.541665 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-util\") pod \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\" (UID: \"c2b42f5d-467d-4e2f-bd50-980d960dbe2f\") "
Apr 20 23:19:43.542194 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.542165 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-bundle" (OuterVolumeSpecName: "bundle") pod "c2b42f5d-467d-4e2f-bd50-980d960dbe2f" (UID: "c2b42f5d-467d-4e2f-bd50-980d960dbe2f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:19:43.543660 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.543638 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-kube-api-access-gsfch" (OuterVolumeSpecName: "kube-api-access-gsfch") pod "c2b42f5d-467d-4e2f-bd50-980d960dbe2f" (UID: "c2b42f5d-467d-4e2f-bd50-980d960dbe2f"). InnerVolumeSpecName "kube-api-access-gsfch". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:19:43.546888 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.546861 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-util" (OuterVolumeSpecName: "util") pod "c2b42f5d-467d-4e2f-bd50-980d960dbe2f" (UID: "c2b42f5d-467d-4e2f-bd50-980d960dbe2f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:19:43.642118 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.642082 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6q6r\" (UniqueName: \"kubernetes.io/projected/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-kube-api-access-b6q6r\") pod \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") "
Apr 20 23:19:43.642118 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.642126 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-bundle\") pod \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") "
Apr 20 23:19:43.642355 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.642145 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-util\") pod \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\" (UID: \"5bd464c7-cddf-45d2-8b39-67d3bbb399e5\") "
Apr 20 23:19:43.642355 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.642301 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:43.642355 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.642311 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:43.642355 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.642319 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gsfch\" (UniqueName: \"kubernetes.io/projected/c2b42f5d-467d-4e2f-bd50-980d960dbe2f-kube-api-access-gsfch\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:43.642752 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.642707 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-bundle" (OuterVolumeSpecName: "bundle") pod "5bd464c7-cddf-45d2-8b39-67d3bbb399e5" (UID: "5bd464c7-cddf-45d2-8b39-67d3bbb399e5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:19:43.644349 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.644319 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-kube-api-access-b6q6r" (OuterVolumeSpecName: "kube-api-access-b6q6r") pod "5bd464c7-cddf-45d2-8b39-67d3bbb399e5" (UID: "5bd464c7-cddf-45d2-8b39-67d3bbb399e5"). InnerVolumeSpecName "kube-api-access-b6q6r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:19:43.647579 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.647558 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-util" (OuterVolumeSpecName: "util") pod "5bd464c7-cddf-45d2-8b39-67d3bbb399e5" (UID: "5bd464c7-cddf-45d2-8b39-67d3bbb399e5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:19:43.743078 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.743035 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6q6r\" (UniqueName: \"kubernetes.io/projected/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-kube-api-access-b6q6r\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:43.743078 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.743071 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:43.743078 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:43.743081 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5bd464c7-cddf-45d2-8b39-67d3bbb399e5-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:44.350853 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.350823 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r"
Apr 20 23:19:44.350853 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.350827 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r" event={"ID":"5bd464c7-cddf-45d2-8b39-67d3bbb399e5","Type":"ContainerDied","Data":"6964ea325d57bce001655f8f0f21e4ccc1cf9f51cd74ac5fb35edb409253b063"}
Apr 20 23:19:44.350853 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.350864 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6964ea325d57bce001655f8f0f21e4ccc1cf9f51cd74ac5fb35edb409253b063"
Apr 20 23:19:44.352636 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.352610 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x" event={"ID":"c2b42f5d-467d-4e2f-bd50-980d960dbe2f","Type":"ContainerDied","Data":"17aaf4b5b9815379e7797222506ce1d893770a5cfad13659bea674e1fee123b3"}
Apr 20 23:19:44.352636 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.352636 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17aaf4b5b9815379e7797222506ce1d893770a5cfad13659bea674e1fee123b3"
Apr 20 23:19:44.352839 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.352787 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x"
Apr 20 23:19:44.480737 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.480715 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4"
Apr 20 23:19:44.548819 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.548780 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-bundle\") pod \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") "
Apr 20 23:19:44.549039 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.548829 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-util\") pod \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") "
Apr 20 23:19:44.549039 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.548880 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj2vd\" (UniqueName: \"kubernetes.io/projected/5a276cc4-8103-401a-8e8c-2965d47b3cfc-kube-api-access-jj2vd\") pod \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\" (UID: \"5a276cc4-8103-401a-8e8c-2965d47b3cfc\") "
Apr 20 23:19:44.549404 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.549372 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-bundle" (OuterVolumeSpecName: "bundle") pod "5a276cc4-8103-401a-8e8c-2965d47b3cfc" (UID: "5a276cc4-8103-401a-8e8c-2965d47b3cfc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:19:44.551116 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.551096 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a276cc4-8103-401a-8e8c-2965d47b3cfc-kube-api-access-jj2vd" (OuterVolumeSpecName: "kube-api-access-jj2vd") pod "5a276cc4-8103-401a-8e8c-2965d47b3cfc" (UID: "5a276cc4-8103-401a-8e8c-2965d47b3cfc"). InnerVolumeSpecName "kube-api-access-jj2vd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:19:44.554418 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.554378 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-util" (OuterVolumeSpecName: "util") pod "5a276cc4-8103-401a-8e8c-2965d47b3cfc" (UID: "5a276cc4-8103-401a-8e8c-2965d47b3cfc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 23:19:44.649680 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.649571 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-bundle\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:44.649680 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.649616 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5a276cc4-8103-401a-8e8c-2965d47b3cfc-util\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:44.649680 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:44.649629 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jj2vd\" (UniqueName: \"kubernetes.io/projected/5a276cc4-8103-401a-8e8c-2965d47b3cfc-kube-api-access-jj2vd\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:19:45.358399 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:45.358364 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4"
Apr 20 23:19:45.358878 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:45.358364 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4" event={"ID":"5a276cc4-8103-401a-8e8c-2965d47b3cfc","Type":"ContainerDied","Data":"5374726c678b944557a43c5ece30418ea201e45d30258163da02115b0750693f"}
Apr 20 23:19:45.358878 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:45.358472 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5374726c678b944557a43c5ece30418ea201e45d30258163da02115b0750693f"
Apr 20 23:19:54.814836 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.814793 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"]
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815133 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerName="pull"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815145 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerName="pull"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815157 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerName="pull"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815165 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerName="pull"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815174 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerName="util"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815179 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerName="util"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815185 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerName="util"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815191 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerName="util"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815197 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerName="util"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815204 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerName="util"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815211 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerName="extract"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815216 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerName="extract"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815226 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerName="pull"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815233 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerName="pull"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815245 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerName="extract"
Apr 20 23:19:54.815245 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815250 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerName="extract"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815258 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerName="extract"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815263 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerName="extract"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815271 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerName="pull"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815276 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerName="pull"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815282 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerName="extract"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815287 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerName="extract"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815294 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerName="util"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815299 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerName="util"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815344 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2b42f5d-467d-4e2f-bd50-980d960dbe2f" containerName="extract"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815352 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a276cc4-8103-401a-8e8c-2965d47b3cfc" containerName="extract"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815359 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e6194a3-80c9-4b57-ab03-32d6aaf59c47" containerName="extract"
Apr 20 23:19:54.815709 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.815366 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5bd464c7-cddf-45d2-8b39-67d3bbb399e5" containerName="extract"
Apr 20 23:19:54.824312 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.824289 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"
Apr 20 23:19:54.827424 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.827397 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-5dqcl\""
Apr 20 23:19:54.828362 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.828339 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"]
Apr 20 23:19:54.927981 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:54.927925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrk9g\" (UniqueName: \"kubernetes.io/projected/b129d16b-bbf4-407f-ac87-63fbf2d8d8e0-kube-api-access-vrk9g\") pod \"limitador-operator-controller-manager-85c4996f8c-rchn8\" (UID: \"b129d16b-bbf4-407f-ac87-63fbf2d8d8e0\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"
Apr 20 23:19:55.028399 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:55.028356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrk9g\" (UniqueName: \"kubernetes.io/projected/b129d16b-bbf4-407f-ac87-63fbf2d8d8e0-kube-api-access-vrk9g\") pod \"limitador-operator-controller-manager-85c4996f8c-rchn8\" (UID: \"b129d16b-bbf4-407f-ac87-63fbf2d8d8e0\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"
Apr 20 23:19:55.038266 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:55.038233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrk9g\" (UniqueName: \"kubernetes.io/projected/b129d16b-bbf4-407f-ac87-63fbf2d8d8e0-kube-api-access-vrk9g\") pod \"limitador-operator-controller-manager-85c4996f8c-rchn8\" (UID: \"b129d16b-bbf4-407f-ac87-63fbf2d8d8e0\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"
Apr 20 23:19:55.135249 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:55.135209 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"
Apr 20 23:19:55.259219 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:55.259189 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"]
Apr 20 23:19:55.260787 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:19:55.260752 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb129d16b_bbf4_407f_ac87_63fbf2d8d8e0.slice/crio-dc2a319244643a5de9e4a09aa2dfbb6ef7c182344aff9f8d34bedaaa56d8fcf9 WatchSource:0}: Error finding container dc2a319244643a5de9e4a09aa2dfbb6ef7c182344aff9f8d34bedaaa56d8fcf9: Status 404 returned error can't find the container with id dc2a319244643a5de9e4a09aa2dfbb6ef7c182344aff9f8d34bedaaa56d8fcf9
Apr 20 23:19:55.394139 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:55.394054 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" event={"ID":"b129d16b-bbf4-407f-ac87-63fbf2d8d8e0","Type":"ContainerStarted","Data":"dc2a319244643a5de9e4a09aa2dfbb6ef7c182344aff9f8d34bedaaa56d8fcf9"}
Apr 20 23:19:57.404750 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:57.404709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" event={"ID":"b129d16b-bbf4-407f-ac87-63fbf2d8d8e0","Type":"ContainerStarted","Data":"6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080"}
Apr 20 23:19:57.405162 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:57.404781 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"
Apr 20 23:19:57.423062 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:19:57.423008 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" podStartSLOduration=1.753360363 podStartE2EDuration="3.422995125s" podCreationTimestamp="2026-04-20 23:19:54 +0000 UTC" firstStartedPulling="2026-04-20 23:19:55.262675723 +0000 UTC m=+424.934044848" lastFinishedPulling="2026-04-20 23:19:56.932310482 +0000 UTC m=+426.603679610" observedRunningTime="2026-04-20 23:19:57.421901388 +0000 UTC m=+427.093270534" watchObservedRunningTime="2026-04-20 23:19:57.422995125 +0000 UTC m=+427.094364286"
Apr 20 23:20:00.163909 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.163866 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"]
Apr 20 23:20:00.167184 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.167168 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"
Apr 20 23:20:00.169598 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.169578 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-ws4dn\""
Apr 20 23:20:00.177386 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.177358 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"]
Apr 20 23:20:00.271586 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.271545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds2bm\" (UniqueName: \"kubernetes.io/projected/904b80c7-25f6-4ff9-91ef-e1810595b719-kube-api-access-ds2bm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" (UID: \"904b80c7-25f6-4ff9-91ef-e1810595b719\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"
Apr 20 23:20:00.271760 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.271624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/904b80c7-25f6-4ff9-91ef-e1810595b719-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" (UID: \"904b80c7-25f6-4ff9-91ef-e1810595b719\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"
Apr 20 23:20:00.372118 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.372038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds2bm\" (UniqueName: \"kubernetes.io/projected/904b80c7-25f6-4ff9-91ef-e1810595b719-kube-api-access-ds2bm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" (UID: \"904b80c7-25f6-4ff9-91ef-e1810595b719\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"
Apr 20 23:20:00.372118 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.372114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/904b80c7-25f6-4ff9-91ef-e1810595b719-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" (UID: \"904b80c7-25f6-4ff9-91ef-e1810595b719\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"
Apr 20 23:20:00.372548 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.372528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/904b80c7-25f6-4ff9-91ef-e1810595b719-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" (UID: \"904b80c7-25f6-4ff9-91ef-e1810595b719\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"
Apr 20 23:20:00.386220 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.386192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds2bm\" (UniqueName: \"kubernetes.io/projected/904b80c7-25f6-4ff9-91ef-e1810595b719-kube-api-access-ds2bm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" (UID: \"904b80c7-25f6-4ff9-91ef-e1810595b719\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"
Apr 20 23:20:00.477358 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.477272 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" Apr 20 23:20:00.612511 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:00.612485 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"] Apr 20 23:20:00.614484 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:20:00.614454 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod904b80c7_25f6_4ff9_91ef_e1810595b719.slice/crio-bd8fa72ef123de8f0db03d4d9cac32185e02336b8e377417a4f2a925af184d9d WatchSource:0}: Error finding container bd8fa72ef123de8f0db03d4d9cac32185e02336b8e377417a4f2a925af184d9d: Status 404 returned error can't find the container with id bd8fa72ef123de8f0db03d4d9cac32185e02336b8e377417a4f2a925af184d9d Apr 20 23:20:01.420694 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:01.420642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" event={"ID":"904b80c7-25f6-4ff9-91ef-e1810595b719","Type":"ContainerStarted","Data":"bd8fa72ef123de8f0db03d4d9cac32185e02336b8e377417a4f2a925af184d9d"} Apr 20 23:20:06.443790 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:06.443753 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" event={"ID":"904b80c7-25f6-4ff9-91ef-e1810595b719","Type":"ContainerStarted","Data":"b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546"} Apr 20 23:20:06.444353 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:06.443843 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" Apr 20 23:20:06.463921 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:06.463870 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" podStartSLOduration=1.480470342 podStartE2EDuration="6.463854465s" podCreationTimestamp="2026-04-20 23:20:00 +0000 UTC" firstStartedPulling="2026-04-20 23:20:00.61695788 +0000 UTC m=+430.288327018" lastFinishedPulling="2026-04-20 23:20:05.600342012 +0000 UTC m=+435.271711141" observedRunningTime="2026-04-20 23:20:06.461926239 +0000 UTC m=+436.133295385" watchObservedRunningTime="2026-04-20 23:20:06.463854465 +0000 UTC m=+436.135223612" Apr 20 23:20:08.411018 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.410985 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" Apr 20 23:20:08.488766 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.488727 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs"] Apr 20 23:20:08.492163 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.492144 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.494342 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.494318 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 23:20:08.494494 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.494346 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8v2gg\"" Apr 20 23:20:08.494494 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.494398 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 23:20:08.499663 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.499326 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs"] Apr 20 23:20:08.649343 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.649307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dac056d3-bd56-4772-80bc-fce96d35e020-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ppwgs\" (UID: \"dac056d3-bd56-4772-80bc-fce96d35e020\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.649506 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.649361 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zhq7\" (UniqueName: \"kubernetes.io/projected/dac056d3-bd56-4772-80bc-fce96d35e020-kube-api-access-4zhq7\") pod \"kuadrant-console-plugin-6cb54b5c86-ppwgs\" (UID: \"dac056d3-bd56-4772-80bc-fce96d35e020\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.649506 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.649413 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dac056d3-bd56-4772-80bc-fce96d35e020-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-ppwgs\" (UID: \"dac056d3-bd56-4772-80bc-fce96d35e020\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.750720 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.750627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dac056d3-bd56-4772-80bc-fce96d35e020-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ppwgs\" (UID: \"dac056d3-bd56-4772-80bc-fce96d35e020\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.750720 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.750678 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zhq7\" (UniqueName: \"kubernetes.io/projected/dac056d3-bd56-4772-80bc-fce96d35e020-kube-api-access-4zhq7\") pod \"kuadrant-console-plugin-6cb54b5c86-ppwgs\" (UID: \"dac056d3-bd56-4772-80bc-fce96d35e020\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.750720 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.750719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dac056d3-bd56-4772-80bc-fce96d35e020-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-ppwgs\" (UID: \"dac056d3-bd56-4772-80bc-fce96d35e020\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.751566 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.751540 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/dac056d3-bd56-4772-80bc-fce96d35e020-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-ppwgs\" (UID: 
\"dac056d3-bd56-4772-80bc-fce96d35e020\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.753203 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.753165 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/dac056d3-bd56-4772-80bc-fce96d35e020-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ppwgs\" (UID: \"dac056d3-bd56-4772-80bc-fce96d35e020\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.758595 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.758567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zhq7\" (UniqueName: \"kubernetes.io/projected/dac056d3-bd56-4772-80bc-fce96d35e020-kube-api-access-4zhq7\") pod \"kuadrant-console-plugin-6cb54b5c86-ppwgs\" (UID: \"dac056d3-bd56-4772-80bc-fce96d35e020\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.801913 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.801876 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" Apr 20 23:20:08.928861 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:08.928838 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs"] Apr 20 23:20:08.931001 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:20:08.930975 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddac056d3_bd56_4772_80bc_fce96d35e020.slice/crio-f24209e82540ed3235db89f27535b7bcfe237abb4d46cf49f058f4db14c6e7fa WatchSource:0}: Error finding container f24209e82540ed3235db89f27535b7bcfe237abb4d46cf49f058f4db14c6e7fa: Status 404 returned error can't find the container with id f24209e82540ed3235db89f27535b7bcfe237abb4d46cf49f058f4db14c6e7fa Apr 20 23:20:09.455069 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:09.455032 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" event={"ID":"dac056d3-bd56-4772-80bc-fce96d35e020","Type":"ContainerStarted","Data":"f24209e82540ed3235db89f27535b7bcfe237abb4d46cf49f058f4db14c6e7fa"} Apr 20 23:20:17.450908 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:17.450879 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" Apr 20 23:20:19.355188 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.355149 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"] Apr 20 23:20:19.355611 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.355384 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" podUID="904b80c7-25f6-4ff9-91ef-e1810595b719" containerName="manager" 
containerID="cri-o://b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546" gracePeriod=2 Apr 20 23:20:19.367460 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.367421 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g"] Apr 20 23:20:19.382809 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.382758 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"] Apr 20 23:20:19.383269 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.383158 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" Apr 20 23:20:19.387187 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.387148 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g"] Apr 20 23:20:19.414249 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.414215 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"] Apr 20 23:20:19.414482 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.414457 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" containerName="manager" containerID="cri-o://6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080" gracePeriod=2 Apr 20 23:20:19.429154 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.429057 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"] Apr 20 23:20:19.434865 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.434835 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bd4mg\" (UniqueName: \"kubernetes.io/projected/16f36687-4eac-470f-9725-842eeb9bef3f-kube-api-access-bd4mg\") pod \"kuadrant-operator-controller-manager-55c7f4c975-k444g\" (UID: \"16f36687-4eac-470f-9725-842eeb9bef3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" Apr 20 23:20:19.435027 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.434896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/16f36687-4eac-470f-9725-842eeb9bef3f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-k444g\" (UID: \"16f36687-4eac-470f-9725-842eeb9bef3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" Apr 20 23:20:19.439289 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.439262 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl"] Apr 20 23:20:19.439586 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.439572 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="904b80c7-25f6-4ff9-91ef-e1810595b719" containerName="manager" Apr 20 23:20:19.439661 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.439588 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="904b80c7-25f6-4ff9-91ef-e1810595b719" containerName="manager" Apr 20 23:20:19.439661 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.439608 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" containerName="manager" Apr 20 23:20:19.439661 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.439614 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" containerName="manager" Apr 20 23:20:19.439820 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.439675 2577 
memory_manager.go:356] "RemoveStaleState removing state" podUID="904b80c7-25f6-4ff9-91ef-e1810595b719" containerName="manager" Apr 20 23:20:19.439820 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.439688 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" containerName="manager" Apr 20 23:20:19.444417 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.444395 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl" Apr 20 23:20:19.449044 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.449014 2577 status_manager.go:895] "Failed to get status for pod" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" err="pods \"limitador-operator-controller-manager-85c4996f8c-rchn8\" is forbidden: User \"system:node:ip-10-0-134-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-166.ec2.internal' and this object" Apr 20 23:20:19.454653 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.454625 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl"] Apr 20 23:20:19.535330 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.535282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/16f36687-4eac-470f-9725-842eeb9bef3f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-k444g\" (UID: \"16f36687-4eac-470f-9725-842eeb9bef3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" Apr 20 23:20:19.535527 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.535369 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-bd4mg\" (UniqueName: \"kubernetes.io/projected/16f36687-4eac-470f-9725-842eeb9bef3f-kube-api-access-bd4mg\") pod \"kuadrant-operator-controller-manager-55c7f4c975-k444g\" (UID: \"16f36687-4eac-470f-9725-842eeb9bef3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" Apr 20 23:20:19.535527 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.535404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7qn6\" (UniqueName: \"kubernetes.io/projected/5ab29ae2-f438-409a-a976-948fc8bc749a-kube-api-access-x7qn6\") pod \"limitador-operator-controller-manager-85c4996f8c-pfdcl\" (UID: \"5ab29ae2-f438-409a-a976-948fc8bc749a\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl" Apr 20 23:20:19.535753 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.535729 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/16f36687-4eac-470f-9725-842eeb9bef3f-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-k444g\" (UID: \"16f36687-4eac-470f-9725-842eeb9bef3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" Apr 20 23:20:19.544981 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.544928 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd4mg\" (UniqueName: \"kubernetes.io/projected/16f36687-4eac-470f-9725-842eeb9bef3f-kube-api-access-bd4mg\") pod \"kuadrant-operator-controller-manager-55c7f4c975-k444g\" (UID: \"16f36687-4eac-470f-9725-842eeb9bef3f\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" Apr 20 23:20:19.635953 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.635846 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7qn6\" (UniqueName: 
\"kubernetes.io/projected/5ab29ae2-f438-409a-a976-948fc8bc749a-kube-api-access-x7qn6\") pod \"limitador-operator-controller-manager-85c4996f8c-pfdcl\" (UID: \"5ab29ae2-f438-409a-a976-948fc8bc749a\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl" Apr 20 23:20:19.644678 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.644647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7qn6\" (UniqueName: \"kubernetes.io/projected/5ab29ae2-f438-409a-a976-948fc8bc749a-kube-api-access-x7qn6\") pod \"limitador-operator-controller-manager-85c4996f8c-pfdcl\" (UID: \"5ab29ae2-f438-409a-a976-948fc8bc749a\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl" Apr 20 23:20:19.796872 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.796818 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" Apr 20 23:20:19.804736 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:19.804702 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl" Apr 20 23:20:20.921071 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:20.921021 2577 status_manager.go:895] "Failed to get status for pod" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" err="pods \"limitador-operator-controller-manager-85c4996f8c-rchn8\" is forbidden: User \"system:node:ip-10-0-134-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-166.ec2.internal' and this object" Apr 20 23:20:32.169830 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.169533 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" Apr 20 23:20:32.172223 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.172181 2577 status_manager.go:895] "Failed to get status for pod" podUID="904b80c7-25f6-4ff9-91ef-e1810595b719" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" is forbidden: User \"system:node:ip-10-0-134-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-166.ec2.internal' and this object" Apr 20 23:20:32.174660 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.174637 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" Apr 20 23:20:32.178020 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.177993 2577 status_manager.go:895] "Failed to get status for pod" podUID="904b80c7-25f6-4ff9-91ef-e1810595b719" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" is forbidden: User \"system:node:ip-10-0-134-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-166.ec2.internal' and this object" Apr 20 23:20:32.179744 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.179715 2577 status_manager.go:895] "Failed to get status for pod" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" err="pods \"limitador-operator-controller-manager-85c4996f8c-rchn8\" is forbidden: User \"system:node:ip-10-0-134-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 
'ip-10-0-134-166.ec2.internal' and this object" Apr 20 23:20:32.220071 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.220039 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g"] Apr 20 23:20:32.220559 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:20:32.220532 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f36687_4eac_470f_9725_842eeb9bef3f.slice/crio-72dfa1c343ca8a7fd87d17929ad31147e96cc080e675228c80029be1be66f6d3 WatchSource:0}: Error finding container 72dfa1c343ca8a7fd87d17929ad31147e96cc080e675228c80029be1be66f6d3: Status 404 returned error can't find the container with id 72dfa1c343ca8a7fd87d17929ad31147e96cc080e675228c80029be1be66f6d3 Apr 20 23:20:32.235862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.235832 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl"] Apr 20 23:20:32.236823 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:20:32.236791 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab29ae2_f438_409a_a976_948fc8bc749a.slice/crio-d3a7a6b91596a7f5d429772a94e3dd0fea0ad6aebc4a49742614a72dfac88b59 WatchSource:0}: Error finding container d3a7a6b91596a7f5d429772a94e3dd0fea0ad6aebc4a49742614a72dfac88b59: Status 404 returned error can't find the container with id d3a7a6b91596a7f5d429772a94e3dd0fea0ad6aebc4a49742614a72dfac88b59 Apr 20 23:20:32.239284 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.239259 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/904b80c7-25f6-4ff9-91ef-e1810595b719-extensions-socket-volume\") pod \"904b80c7-25f6-4ff9-91ef-e1810595b719\" (UID: \"904b80c7-25f6-4ff9-91ef-e1810595b719\") " Apr 20 
23:20:32.239385 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.239310 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds2bm\" (UniqueName: \"kubernetes.io/projected/904b80c7-25f6-4ff9-91ef-e1810595b719-kube-api-access-ds2bm\") pod \"904b80c7-25f6-4ff9-91ef-e1810595b719\" (UID: \"904b80c7-25f6-4ff9-91ef-e1810595b719\") " Apr 20 23:20:32.239385 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.239367 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrk9g\" (UniqueName: \"kubernetes.io/projected/b129d16b-bbf4-407f-ac87-63fbf2d8d8e0-kube-api-access-vrk9g\") pod \"b129d16b-bbf4-407f-ac87-63fbf2d8d8e0\" (UID: \"b129d16b-bbf4-407f-ac87-63fbf2d8d8e0\") " Apr 20 23:20:32.239783 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.239756 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/904b80c7-25f6-4ff9-91ef-e1810595b719-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "904b80c7-25f6-4ff9-91ef-e1810595b719" (UID: "904b80c7-25f6-4ff9-91ef-e1810595b719"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 23:20:32.241471 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.241440 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/904b80c7-25f6-4ff9-91ef-e1810595b719-kube-api-access-ds2bm" (OuterVolumeSpecName: "kube-api-access-ds2bm") pod "904b80c7-25f6-4ff9-91ef-e1810595b719" (UID: "904b80c7-25f6-4ff9-91ef-e1810595b719"). InnerVolumeSpecName "kube-api-access-ds2bm". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:20:32.241562 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.241498 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b129d16b-bbf4-407f-ac87-63fbf2d8d8e0-kube-api-access-vrk9g" (OuterVolumeSpecName: "kube-api-access-vrk9g") pod "b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" (UID: "b129d16b-bbf4-407f-ac87-63fbf2d8d8e0"). InnerVolumeSpecName "kube-api-access-vrk9g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 23:20:32.341107 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.341064 2577 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/904b80c7-25f6-4ff9-91ef-e1810595b719-extensions-socket-volume\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:20:32.341107 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.341109 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ds2bm\" (UniqueName: \"kubernetes.io/projected/904b80c7-25f6-4ff9-91ef-e1810595b719-kube-api-access-ds2bm\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:20:32.341315 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.341124 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vrk9g\" (UniqueName: \"kubernetes.io/projected/b129d16b-bbf4-407f-ac87-63fbf2d8d8e0-kube-api-access-vrk9g\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\""
Apr 20 23:20:32.551838 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.551738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl" event={"ID":"5ab29ae2-f438-409a-a976-948fc8bc749a","Type":"ContainerStarted","Data":"6d670171267eabbc53f6abb5b3457279dd3211b55078aaa454acb9b291817590"}
Apr 20 23:20:32.551838 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.551789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl" event={"ID":"5ab29ae2-f438-409a-a976-948fc8bc749a","Type":"ContainerStarted","Data":"d3a7a6b91596a7f5d429772a94e3dd0fea0ad6aebc4a49742614a72dfac88b59"}
Apr 20 23:20:32.552094 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.551847 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl"
Apr 20 23:20:32.552882 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.552856 2577 generic.go:358] "Generic (PLEG): container finished" podID="904b80c7-25f6-4ff9-91ef-e1810595b719" containerID="b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546" exitCode=0
Apr 20 23:20:32.553013 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.552916 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx"
Apr 20 23:20:32.553013 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.552956 2577 scope.go:117] "RemoveContainer" containerID="b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546"
Apr 20 23:20:32.554073 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.554018 2577 status_manager.go:895] "Failed to get status for pod" podUID="904b80c7-25f6-4ff9-91ef-e1810595b719" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" is forbidden: User \"system:node:ip-10-0-134-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-166.ec2.internal' and this object"
Apr 20 23:20:32.554247 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.554223 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" event={"ID":"dac056d3-bd56-4772-80bc-fce96d35e020","Type":"ContainerStarted","Data":"61b239d7d7e92314292ca0bc2a167a5322d080a4035a2d12ec599bc3b40835a9"}
Apr 20 23:20:32.555703 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.555680 2577 status_manager.go:895] "Failed to get status for pod" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" err="pods \"limitador-operator-controller-manager-85c4996f8c-rchn8\" is forbidden: User \"system:node:ip-10-0-134-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-166.ec2.internal' and this object"
Apr 20 23:20:32.555829 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.555776 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" event={"ID":"16f36687-4eac-470f-9725-842eeb9bef3f","Type":"ContainerStarted","Data":"c1ff93888a30d29bdb68a7457ff039cba4fb3c2c44a13adc1445938cdc403adb"}
Apr 20 23:20:32.555829 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.555808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" event={"ID":"16f36687-4eac-470f-9725-842eeb9bef3f","Type":"ContainerStarted","Data":"72dfa1c343ca8a7fd87d17929ad31147e96cc080e675228c80029be1be66f6d3"}
Apr 20 23:20:32.555959 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.555890 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g"
Apr 20 23:20:32.556980 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.556937 2577 generic.go:358] "Generic (PLEG): container finished" podID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" containerID="6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080" exitCode=0
Apr 20 23:20:32.557080 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.556997 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8"
Apr 20 23:20:32.562183 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.562161 2577 scope.go:117] "RemoveContainer" containerID="b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546"
Apr 20 23:20:32.562462 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:20:32.562430 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546\": container with ID starting with b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546 not found: ID does not exist" containerID="b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546"
Apr 20 23:20:32.562569 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.562462 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546"} err="failed to get container status \"b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546\": rpc error: code = NotFound desc = could not find container \"b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546\": container with ID starting with b7532da179b7ad1ea1f1640f6b947e7e6d3b49fbd1c3bfacd528c277f888c546 not found: ID does not exist"
Apr 20 23:20:32.562569 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.562486 2577 scope.go:117] "RemoveContainer" containerID="6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080"
Apr 20 23:20:32.570566 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.570520 2577 scope.go:117] "RemoveContainer" containerID="6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080"
Apr 20 23:20:32.570968 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:20:32.570923 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080\": container with ID starting with 6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080 not found: ID does not exist" containerID="6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080"
Apr 20 23:20:32.571074 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.570989 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080"} err="failed to get container status \"6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080\": rpc error: code = NotFound desc = could not find container \"6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080\": container with ID starting with 6a719499afa0be02b39ebfe65c4965eaefe85102cbef2f6f5498792866228080 not found: ID does not exist"
Apr 20 23:20:32.572292 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.572267 2577 status_manager.go:895] "Failed to get status for pod" podUID="904b80c7-25f6-4ff9-91ef-e1810595b719" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-7p5gx" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-7p5gx\" is forbidden: User \"system:node:ip-10-0-134-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-166.ec2.internal' and this object"
Apr 20 23:20:32.572724 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.572688 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl" podStartSLOduration=13.572676607 podStartE2EDuration="13.572676607s" podCreationTimestamp="2026-04-20 23:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:20:32.570396533 +0000 UTC m=+462.241765679" watchObservedRunningTime="2026-04-20 23:20:32.572676607 +0000 UTC m=+462.244045754"
Apr 20 23:20:32.573845 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.573820 2577 status_manager.go:895] "Failed to get status for pod" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rchn8" err="pods \"limitador-operator-controller-manager-85c4996f8c-rchn8\" is forbidden: User \"system:node:ip-10-0-134-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-166.ec2.internal' and this object"
Apr 20 23:20:32.589887 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.589839 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ppwgs" podStartSLOduration=1.40858106 podStartE2EDuration="24.589823428s" podCreationTimestamp="2026-04-20 23:20:08 +0000 UTC" firstStartedPulling="2026-04-20 23:20:08.932407444 +0000 UTC m=+438.603776572" lastFinishedPulling="2026-04-20 23:20:32.113649811 +0000 UTC m=+461.785018940" observedRunningTime="2026-04-20 23:20:32.587541535 +0000 UTC m=+462.258910681" watchObservedRunningTime="2026-04-20 23:20:32.589823428 +0000 UTC m=+462.261192574"
Apr 20 23:20:32.604131 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.604082 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g" podStartSLOduration=13.604066261 podStartE2EDuration="13.604066261s" podCreationTimestamp="2026-04-20 23:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:20:32.602795286 +0000 UTC m=+462.274164434" watchObservedRunningTime="2026-04-20 23:20:32.604066261 +0000 UTC m=+462.275435472"
Apr 20 23:20:32.919903 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.919867 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="904b80c7-25f6-4ff9-91ef-e1810595b719" path="/var/lib/kubelet/pods/904b80c7-25f6-4ff9-91ef-e1810595b719/volumes"
Apr 20 23:20:32.920225 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:32.920212 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b129d16b-bbf4-407f-ac87-63fbf2d8d8e0" path="/var/lib/kubelet/pods/b129d16b-bbf4-407f-ac87-63fbf2d8d8e0/volumes"
Apr 20 23:20:43.565723 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:43.565688 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-pfdcl"
Apr 20 23:20:43.566116 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:43.565743 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-k444g"
Apr 20 23:20:59.899824 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:59.899785 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"]
Apr 20 23:20:59.991194 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:59.991153 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"]
Apr 20 23:20:59.991375 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:59.991319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:20:59.993618 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:20:59.993595 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-phq4h\""
Apr 20 23:21:00.070110 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.070066 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.070110 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.070108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.070342 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.070172 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1fcfc911-4199-4661-947f-23b4b9e69300-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.070342 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.070304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcvnl\" (UniqueName: \"kubernetes.io/projected/1fcfc911-4199-4661-947f-23b4b9e69300-kube-api-access-jcvnl\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.070432 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.070355 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1fcfc911-4199-4661-947f-23b4b9e69300-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.070432 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.070411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.070503 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.070439 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1fcfc911-4199-4661-947f-23b4b9e69300-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.070503 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.070478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.070572 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.070508 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171330 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1fcfc911-4199-4661-947f-23b4b9e69300-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171330 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171284 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171330 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1fcfc911-4199-4661-947f-23b4b9e69300-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171449 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1fcfc911-4199-4661-947f-23b4b9e69300-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcvnl\" (UniqueName: \"kubernetes.io/projected/1fcfc911-4199-4661-947f-23b4b9e69300-kube-api-access-jcvnl\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.171905 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.172005 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.171980 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.172076 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.172001 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.172131 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.172093 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.172308 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.172286 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/1fcfc911-4199-4661-947f-23b4b9e69300-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.173932 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.173912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/1fcfc911-4199-4661-947f-23b4b9e69300-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.174020 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.173999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/1fcfc911-4199-4661-947f-23b4b9e69300-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.180215 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.180171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/1fcfc911-4199-4661-947f-23b4b9e69300-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.180336 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.180244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcvnl\" (UniqueName: \"kubernetes.io/projected/1fcfc911-4199-4661-947f-23b4b9e69300-kube-api-access-jcvnl\") pod \"maas-default-gateway-openshift-default-58b6f876-95dhp\" (UID: \"1fcfc911-4199-4661-947f-23b4b9e69300\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.302005 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.301933 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:00.431685 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.431607 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"]
Apr 20 23:21:00.434726 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:21:00.434689 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fcfc911_4199_4661_947f_23b4b9e69300.slice/crio-fd0b9ef86686e47c4a5a95e3d8f21e6d9e1a8a2a62a9226de21da3a660275700 WatchSource:0}: Error finding container fd0b9ef86686e47c4a5a95e3d8f21e6d9e1a8a2a62a9226de21da3a660275700: Status 404 returned error can't find the container with id fd0b9ef86686e47c4a5a95e3d8f21e6d9e1a8a2a62a9226de21da3a660275700
Apr 20 23:21:00.436891 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.436852 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 20 23:21:00.436997 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.436966 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 20 23:21:00.437041 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.437002 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 20 23:21:00.670455 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.670421 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp" event={"ID":"1fcfc911-4199-4661-947f-23b4b9e69300","Type":"ContainerStarted","Data":"eb682f5976684e457ade772e750d508f96e06cec3eadb2aee01017d9f0fe0060"}
Apr 20 23:21:00.670455 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.670461 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp" event={"ID":"1fcfc911-4199-4661-947f-23b4b9e69300","Type":"ContainerStarted","Data":"fd0b9ef86686e47c4a5a95e3d8f21e6d9e1a8a2a62a9226de21da3a660275700"}
Apr 20 23:21:00.704517 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:00.703467 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp" podStartSLOduration=1.703446468 podStartE2EDuration="1.703446468s" podCreationTimestamp="2026-04-20 23:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:21:00.695546115 +0000 UTC m=+490.366915261" watchObservedRunningTime="2026-04-20 23:21:00.703446468 +0000 UTC m=+490.374815618"
Apr 20 23:21:01.302383 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:01.302339 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:01.307604 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:01.307578 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:01.674327 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:01.674294 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:21:01.675535 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:21:01.675511 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-95dhp"
Apr 20 23:22:50.821782 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:22:50.821750 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/2.log"
Apr 20 23:22:50.822354 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:22:50.822169 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/2.log"
Apr 20 23:27:34.061775 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.061686 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-d94mw"]
Apr 20 23:27:34.065150 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.065125 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-d94mw"
Apr 20 23:27:34.068080 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.067780 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wl44v\""
Apr 20 23:27:34.070259 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.070234 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-d94mw"]
Apr 20 23:27:34.145791 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.145553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qk6p\" (UniqueName: \"kubernetes.io/projected/c242e56f-dc46-4205-97c2-6cd996b11cb0-kube-api-access-5qk6p\") pod \"authorino-f99f4b5cd-d94mw\" (UID: \"c242e56f-dc46-4205-97c2-6cd996b11cb0\") " pod="kuadrant-system/authorino-f99f4b5cd-d94mw"
Apr 20 23:27:34.247534 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.247105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qk6p\" (UniqueName: \"kubernetes.io/projected/c242e56f-dc46-4205-97c2-6cd996b11cb0-kube-api-access-5qk6p\") pod \"authorino-f99f4b5cd-d94mw\" (UID: \"c242e56f-dc46-4205-97c2-6cd996b11cb0\") " pod="kuadrant-system/authorino-f99f4b5cd-d94mw"
Apr 20 23:27:34.255808 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.255778 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qk6p\" (UniqueName: \"kubernetes.io/projected/c242e56f-dc46-4205-97c2-6cd996b11cb0-kube-api-access-5qk6p\") pod \"authorino-f99f4b5cd-d94mw\" (UID: \"c242e56f-dc46-4205-97c2-6cd996b11cb0\") " pod="kuadrant-system/authorino-f99f4b5cd-d94mw"
Apr 20 23:27:34.397306 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.397270 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-d94mw"
Apr 20 23:27:34.491017 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.490986 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-48hbc"]
Apr 20 23:27:34.494230 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.494212 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-48hbc"
Apr 20 23:27:34.501097 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.500758 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-48hbc"]
Apr 20 23:27:34.551969 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.550401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4tr7\" (UniqueName: \"kubernetes.io/projected/d2d8e2e6-16e5-42a9-bbcd-0d567bacf009-kube-api-access-l4tr7\") pod \"authorino-7498df8756-48hbc\" (UID: \"d2d8e2e6-16e5-42a9-bbcd-0d567bacf009\") " pod="kuadrant-system/authorino-7498df8756-48hbc"
Apr 20 23:27:34.576613 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.576584 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-d94mw"]
Apr 20 23:27:34.578086 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:27:34.578056 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc242e56f_dc46_4205_97c2_6cd996b11cb0.slice/crio-f6aebea7ddee84d41c36e17846d1cf903f2d957bd9806d4bfb49f2727ee0a989 WatchSource:0}: Error finding container f6aebea7ddee84d41c36e17846d1cf903f2d957bd9806d4bfb49f2727ee0a989: Status 404 returned error can't find the container with id f6aebea7ddee84d41c36e17846d1cf903f2d957bd9806d4bfb49f2727ee0a989
Apr 20 23:27:34.579235 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.579213 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 23:27:34.652549 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.652444 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4tr7\" (UniqueName: \"kubernetes.io/projected/d2d8e2e6-16e5-42a9-bbcd-0d567bacf009-kube-api-access-l4tr7\") pod \"authorino-7498df8756-48hbc\" (UID: \"d2d8e2e6-16e5-42a9-bbcd-0d567bacf009\") " pod="kuadrant-system/authorino-7498df8756-48hbc"
Apr 20 23:27:34.661156 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.661125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4tr7\" (UniqueName: \"kubernetes.io/projected/d2d8e2e6-16e5-42a9-bbcd-0d567bacf009-kube-api-access-l4tr7\") pod \"authorino-7498df8756-48hbc\" (UID: \"d2d8e2e6-16e5-42a9-bbcd-0d567bacf009\") " pod="kuadrant-system/authorino-7498df8756-48hbc"
Apr 20 23:27:34.807128 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.807088 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-48hbc"
Apr 20 23:27:34.956986 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:34.956958 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-48hbc"]
Apr 20 23:27:34.958477 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:27:34.958445 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2d8e2e6_16e5_42a9_bbcd_0d567bacf009.slice/crio-ee321cef0658d426f88d9a48246b2509ba5656074721b53d397917f8e725264d WatchSource:0}: Error finding container ee321cef0658d426f88d9a48246b2509ba5656074721b53d397917f8e725264d: Status 404 returned error can't find the container with id ee321cef0658d426f88d9a48246b2509ba5656074721b53d397917f8e725264d
Apr 20 23:27:35.107577 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:35.107534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-48hbc" event={"ID":"d2d8e2e6-16e5-42a9-bbcd-0d567bacf009","Type":"ContainerStarted","Data":"ee321cef0658d426f88d9a48246b2509ba5656074721b53d397917f8e725264d"}
Apr 20 23:27:35.108478 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:35.108459 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-d94mw" event={"ID":"c242e56f-dc46-4205-97c2-6cd996b11cb0","Type":"ContainerStarted","Data":"f6aebea7ddee84d41c36e17846d1cf903f2d957bd9806d4bfb49f2727ee0a989"}
Apr 20 23:27:39.133512 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:39.133474 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-48hbc" event={"ID":"d2d8e2e6-16e5-42a9-bbcd-0d567bacf009","Type":"ContainerStarted","Data":"dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585"}
Apr 20 23:27:39.134900 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:39.134874 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-d94mw" event={"ID":"c242e56f-dc46-4205-97c2-6cd996b11cb0","Type":"ContainerStarted","Data":"eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a"}
Apr 20 23:27:39.148616 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:39.148570 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-48hbc" podStartSLOduration=1.834320522 podStartE2EDuration="5.148556178s" podCreationTimestamp="2026-04-20 23:27:34 +0000 UTC" firstStartedPulling="2026-04-20 23:27:34.960145377 +0000 UTC m=+884.631514502" lastFinishedPulling="2026-04-20 23:27:38.274381034 +0000 UTC m=+887.945750158" observedRunningTime="2026-04-20 23:27:39.147901613 +0000 UTC m=+888.819270752" watchObservedRunningTime="2026-04-20 23:27:39.148556178 +0000 UTC m=+888.819925322"
Apr 20 23:27:39.162507 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:39.162449 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-d94mw" podStartSLOduration=1.4783638319999999 podStartE2EDuration="5.162432895s" podCreationTimestamp="2026-04-20 23:27:34 +0000 UTC" firstStartedPulling="2026-04-20 23:27:34.5794189 +0000 UTC m=+884.250788029" lastFinishedPulling="2026-04-20 23:27:38.263487953 +0000 UTC m=+887.934857092" observedRunningTime="2026-04-20 23:27:39.160242518 +0000 UTC m=+888.831611664" watchObservedRunningTime="2026-04-20 23:27:39.162432895 +0000 UTC m=+888.833802042"
Apr 20 23:27:39.188246 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:39.188210 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-d94mw"]
Apr 20 23:27:41.142459 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:41.142420 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-d94mw" podUID="c242e56f-dc46-4205-97c2-6cd996b11cb0" containerName="authorino" containerID="cri-o://eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a" gracePeriod=30
Apr 20 23:27:41.385682 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:41.385658 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-d94mw"
Apr 20 23:27:41.512740 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:41.512653 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qk6p\" (UniqueName: \"kubernetes.io/projected/c242e56f-dc46-4205-97c2-6cd996b11cb0-kube-api-access-5qk6p\") pod \"c242e56f-dc46-4205-97c2-6cd996b11cb0\" (UID: \"c242e56f-dc46-4205-97c2-6cd996b11cb0\") "
Apr 20 23:27:41.514773 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:41.514750 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c242e56f-dc46-4205-97c2-6cd996b11cb0-kube-api-access-5qk6p" (OuterVolumeSpecName: "kube-api-access-5qk6p") pod "c242e56f-dc46-4205-97c2-6cd996b11cb0" (UID: "c242e56f-dc46-4205-97c2-6cd996b11cb0"). InnerVolumeSpecName "kube-api-access-5qk6p".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:27:41.614135 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:41.614095 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5qk6p\" (UniqueName: \"kubernetes.io/projected/c242e56f-dc46-4205-97c2-6cd996b11cb0-kube-api-access-5qk6p\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:27:42.147571 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.147533 2577 generic.go:358] "Generic (PLEG): container finished" podID="c242e56f-dc46-4205-97c2-6cd996b11cb0" containerID="eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a" exitCode=0 Apr 20 23:27:42.148045 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.147577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-d94mw" event={"ID":"c242e56f-dc46-4205-97c2-6cd996b11cb0","Type":"ContainerDied","Data":"eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a"} Apr 20 23:27:42.148045 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.147581 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-d94mw" Apr 20 23:27:42.148045 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.147599 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-d94mw" event={"ID":"c242e56f-dc46-4205-97c2-6cd996b11cb0","Type":"ContainerDied","Data":"f6aebea7ddee84d41c36e17846d1cf903f2d957bd9806d4bfb49f2727ee0a989"} Apr 20 23:27:42.148045 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.147614 2577 scope.go:117] "RemoveContainer" containerID="eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a" Apr 20 23:27:42.157115 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.157096 2577 scope.go:117] "RemoveContainer" containerID="eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a" Apr 20 23:27:42.157411 ip-10-0-134-166 kubenswrapper[2577]: E0420 23:27:42.157392 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a\": container with ID starting with eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a not found: ID does not exist" containerID="eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a" Apr 20 23:27:42.157463 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.157423 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a"} err="failed to get container status \"eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a\": rpc error: code = NotFound desc = could not find container \"eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a\": container with ID starting with eaf6e07fe62e4db7b4c662bfdff71a609954170fbb91c69d62a0e4ca6ac5825a not found: ID does not exist" Apr 20 23:27:42.168689 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.168657 2577 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-d94mw"] Apr 20 23:27:42.171768 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.171741 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-d94mw"] Apr 20 23:27:42.918518 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:42.918488 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c242e56f-dc46-4205-97c2-6cd996b11cb0" path="/var/lib/kubelet/pods/c242e56f-dc46-4205-97c2-6cd996b11cb0/volumes" Apr 20 23:27:50.850646 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:50.850618 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/2.log" Apr 20 23:27:50.852269 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:27:50.852244 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-ip-10-0-134-166.ec2.internal_ff387f4dda566eb1da7627d669fc453f/kube-rbac-proxy-crio/2.log" Apr 20 23:28:22.862606 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.862568 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8"] Apr 20 23:28:22.863102 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.863042 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c242e56f-dc46-4205-97c2-6cd996b11cb0" containerName="authorino" Apr 20 23:28:22.863102 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.863059 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c242e56f-dc46-4205-97c2-6cd996b11cb0" containerName="authorino" Apr 20 23:28:22.863178 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.863154 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c242e56f-dc46-4205-97c2-6cd996b11cb0" containerName="authorino" Apr 20 
23:28:22.867858 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.867832 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:22.871427 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.871392 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 23:28:22.871427 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.871402 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 23:28:22.871633 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.871392 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 23:28:22.871742 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.871722 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-2w8th\"" Apr 20 23:28:22.875242 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.874961 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8"] Apr 20 23:28:22.976868 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.976817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:22.976868 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.976871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0806a190-e33a-41e0-850e-00959e4bdd0e-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" 
(UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:22.977140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.976932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:22.977140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.977018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:22.977140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.977065 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:22.977140 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:22.977113 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsgk6\" (UniqueName: \"kubernetes.io/projected/0806a190-e33a-41e0-850e-00959e4bdd0e-kube-api-access-gsgk6\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.078272 ip-10-0-134-166 
kubenswrapper[2577]: I0420 23:28:23.078231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0806a190-e33a-41e0-850e-00959e4bdd0e-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.078272 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.078287 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.078552 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.078401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.078552 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.078450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.078552 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.078488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsgk6\" (UniqueName: \"kubernetes.io/projected/0806a190-e33a-41e0-850e-00959e4bdd0e-kube-api-access-gsgk6\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.078552 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.078552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.078754 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.078673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.078804 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.078766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.078849 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.078799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.080659 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.080632 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0806a190-e33a-41e0-850e-00959e4bdd0e-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.080913 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.080894 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0806a190-e33a-41e0-850e-00959e4bdd0e-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.085517 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.085495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsgk6\" (UniqueName: \"kubernetes.io/projected/0806a190-e33a-41e0-850e-00959e4bdd0e-kube-api-access-gsgk6\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-m5st8\" (UID: \"0806a190-e33a-41e0-850e-00959e4bdd0e\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.179838 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.179737 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:23.312101 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:23.312077 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8"] Apr 20 23:28:23.313964 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:28:23.313926 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0806a190_e33a_41e0_850e_00959e4bdd0e.slice/crio-bd72ae5509f0a0eb632a04b0184206e78ba60d0d625a2d62ff2c399e5e48002b WatchSource:0}: Error finding container bd72ae5509f0a0eb632a04b0184206e78ba60d0d625a2d62ff2c399e5e48002b: Status 404 returned error can't find the container with id bd72ae5509f0a0eb632a04b0184206e78ba60d0d625a2d62ff2c399e5e48002b Apr 20 23:28:24.313958 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:24.313901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" event={"ID":"0806a190-e33a-41e0-850e-00959e4bdd0e","Type":"ContainerStarted","Data":"bd72ae5509f0a0eb632a04b0184206e78ba60d0d625a2d62ff2c399e5e48002b"} Apr 20 23:28:29.341038 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:29.340998 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" event={"ID":"0806a190-e33a-41e0-850e-00959e4bdd0e","Type":"ContainerStarted","Data":"1f13b12855f29817528c2d98c6a3f4e757eaab8044e5aaea79f972e1a994b346"} Apr 20 23:28:34.361845 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:34.361809 2577 generic.go:358] "Generic (PLEG): container finished" podID="0806a190-e33a-41e0-850e-00959e4bdd0e" containerID="1f13b12855f29817528c2d98c6a3f4e757eaab8044e5aaea79f972e1a994b346" exitCode=0 Apr 20 23:28:34.362254 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:34.361868 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" event={"ID":"0806a190-e33a-41e0-850e-00959e4bdd0e","Type":"ContainerDied","Data":"1f13b12855f29817528c2d98c6a3f4e757eaab8044e5aaea79f972e1a994b346"} Apr 20 23:28:36.371785 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:36.371741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" event={"ID":"0806a190-e33a-41e0-850e-00959e4bdd0e","Type":"ContainerStarted","Data":"e1e9193b163752f5c87468bbda1aada08de9e55bbabf34761834bb1c5f4b2dfc"} Apr 20 23:28:36.372212 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:36.372005 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:36.389304 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:36.389255 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" podStartSLOduration=2.352885264 podStartE2EDuration="14.389239862s" podCreationTimestamp="2026-04-20 23:28:22 +0000 UTC" firstStartedPulling="2026-04-20 23:28:23.315652216 +0000 UTC m=+932.987021341" lastFinishedPulling="2026-04-20 23:28:35.352006811 +0000 UTC m=+945.023375939" observedRunningTime="2026-04-20 23:28:36.388611183 +0000 UTC m=+946.059980343" watchObservedRunningTime="2026-04-20 23:28:36.389239862 +0000 UTC m=+946.060609009" Apr 20 23:28:47.388670 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:47.388638 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-m5st8" Apr 20 23:28:49.773725 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.773691 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg"] Apr 20 23:28:49.798292 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.798257 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg"] Apr 20 23:28:49.798454 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.798381 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:49.801177 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.801154 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 23:28:49.918109 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.918067 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24b1a3e6-aea2-4efa-af4e-6c219785763b-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:49.918286 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.918123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:49.918286 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.918204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:49.918286 
ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.918228 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:49.918286 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.918252 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:49.918423 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:49.918333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzzkw\" (UniqueName: \"kubernetes.io/projected/24b1a3e6-aea2-4efa-af4e-6c219785763b-kube-api-access-tzzkw\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.019105 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.019064 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.019282 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.019133 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.019282 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.019160 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.019282 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.019187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.019282 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.019236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzzkw\" (UniqueName: \"kubernetes.io/projected/24b1a3e6-aea2-4efa-af4e-6c219785763b-kube-api-access-tzzkw\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.019282 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.019268 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/24b1a3e6-aea2-4efa-af4e-6c219785763b-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.019594 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.019566 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.019656 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.019576 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.019656 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.019612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.021564 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.021539 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/24b1a3e6-aea2-4efa-af4e-6c219785763b-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.021883 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.021864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/24b1a3e6-aea2-4efa-af4e-6c219785763b-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.028074 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.028005 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzzkw\" (UniqueName: \"kubernetes.io/projected/24b1a3e6-aea2-4efa-af4e-6c219785763b-kube-api-access-tzzkw\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg\" (UID: \"24b1a3e6-aea2-4efa-af4e-6c219785763b\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.107477 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.107437 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:28:50.233405 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.233379 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg"] Apr 20 23:28:50.235835 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:28:50.235806 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b1a3e6_aea2_4efa_af4e_6c219785763b.slice/crio-ae7e2842dac7f76901b305341fbd6f0a5573515f65f0da9058527128b962de40 WatchSource:0}: Error finding container ae7e2842dac7f76901b305341fbd6f0a5573515f65f0da9058527128b962de40: Status 404 returned error can't find the container with id ae7e2842dac7f76901b305341fbd6f0a5573515f65f0da9058527128b962de40 Apr 20 23:28:50.429724 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.429684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" event={"ID":"24b1a3e6-aea2-4efa-af4e-6c219785763b","Type":"ContainerStarted","Data":"05f74b7824da64ea0050a9816133a204dcfe95945f389a013b9acbbdf8273adc"} Apr 20 23:28:50.429724 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:50.429726 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" event={"ID":"24b1a3e6-aea2-4efa-af4e-6c219785763b","Type":"ContainerStarted","Data":"ae7e2842dac7f76901b305341fbd6f0a5573515f65f0da9058527128b962de40"} Apr 20 23:28:59.470625 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:59.470589 2577 generic.go:358] "Generic (PLEG): container finished" podID="24b1a3e6-aea2-4efa-af4e-6c219785763b" containerID="05f74b7824da64ea0050a9816133a204dcfe95945f389a013b9acbbdf8273adc" exitCode=0 Apr 20 23:28:59.471057 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:28:59.470667 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" event={"ID":"24b1a3e6-aea2-4efa-af4e-6c219785763b","Type":"ContainerDied","Data":"05f74b7824da64ea0050a9816133a204dcfe95945f389a013b9acbbdf8273adc"} Apr 20 23:29:00.476895 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:00.476815 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" event={"ID":"24b1a3e6-aea2-4efa-af4e-6c219785763b","Type":"ContainerStarted","Data":"b28b318875056f4d24c8ea1b45c4914f7a74ebc3e8ecd697351ffd7351a169aa"} Apr 20 23:29:00.477305 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:00.477014 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:29:00.498630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:00.498547 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" podStartSLOduration=11.295397958 podStartE2EDuration="11.4985209s" podCreationTimestamp="2026-04-20 23:28:49 +0000 UTC" firstStartedPulling="2026-04-20 23:28:59.47132437 +0000 UTC m=+969.142693495" lastFinishedPulling="2026-04-20 23:28:59.674447309 +0000 UTC m=+969.345816437" observedRunningTime="2026-04-20 23:29:00.493251839 +0000 UTC m=+970.164620986" watchObservedRunningTime="2026-04-20 23:29:00.4985209 +0000 UTC m=+970.169890094" Apr 20 23:29:11.493978 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:11.493915 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg" Apr 20 23:29:13.283183 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.283150 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs"] Apr 20 23:29:13.314343 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.314290 2577 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs"] Apr 20 23:29:13.314513 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.314475 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.317195 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.317171 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 23:29:13.430328 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.430284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.430501 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.430332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.430501 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.430391 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.430501 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.430443 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.430612 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.430502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.430612 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.430553 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76fr2\" (UniqueName: \"kubernetes.io/projected/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-kube-api-access-76fr2\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.531905 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.531860 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.531905 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.531909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.532168 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.531934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.532168 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.531976 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.532168 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.532034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.532168 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.532059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76fr2\" (UniqueName: \"kubernetes.io/projected/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-kube-api-access-76fr2\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.532476 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.532442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.532599 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.532479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.532599 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.532523 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.534309 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.534256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.534656 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.534637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.539796 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.539770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76fr2\" (UniqueName: \"kubernetes.io/projected/65b1d81f-20e4-4c11-92fb-b8fa37f406ba-kube-api-access-76fr2\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs\" (UID: \"65b1d81f-20e4-4c11-92fb-b8fa37f406ba\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.624289 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.624249 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:13.754886 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:13.754858 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs"] Apr 20 23:29:13.756880 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:29:13.756846 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b1d81f_20e4_4c11_92fb_b8fa37f406ba.slice/crio-afefd9e107c148a9b7810e8d70de61018118430d6c5ce0e6f288d0be9d5eaf89 WatchSource:0}: Error finding container afefd9e107c148a9b7810e8d70de61018118430d6c5ce0e6f288d0be9d5eaf89: Status 404 returned error can't find the container with id afefd9e107c148a9b7810e8d70de61018118430d6c5ce0e6f288d0be9d5eaf89 Apr 20 23:29:14.535651 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:14.535594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" 
event={"ID":"65b1d81f-20e4-4c11-92fb-b8fa37f406ba","Type":"ContainerStarted","Data":"d076843a21c0d2dba0c53a7d315e66a6845c99642529bdb42c7a462faf4e3a6c"} Apr 20 23:29:14.536209 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:14.535660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" event={"ID":"65b1d81f-20e4-4c11-92fb-b8fa37f406ba","Type":"ContainerStarted","Data":"afefd9e107c148a9b7810e8d70de61018118430d6c5ce0e6f288d0be9d5eaf89"} Apr 20 23:29:19.556585 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:19.556547 2577 generic.go:358] "Generic (PLEG): container finished" podID="65b1d81f-20e4-4c11-92fb-b8fa37f406ba" containerID="d076843a21c0d2dba0c53a7d315e66a6845c99642529bdb42c7a462faf4e3a6c" exitCode=0 Apr 20 23:29:19.557079 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:19.556628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" event={"ID":"65b1d81f-20e4-4c11-92fb-b8fa37f406ba","Type":"ContainerDied","Data":"d076843a21c0d2dba0c53a7d315e66a6845c99642529bdb42c7a462faf4e3a6c"} Apr 20 23:29:20.562613 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:20.562580 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" event={"ID":"65b1d81f-20e4-4c11-92fb-b8fa37f406ba","Type":"ContainerStarted","Data":"1b04fa213f2772475f3894c9c8c5562dae06f82b7ee24d49f02feb198f2550c4"} Apr 20 23:29:20.563059 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:20.562797 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:20.581179 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:20.581085 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" podStartSLOduration=7.360417133 podStartE2EDuration="7.581065948s" 
podCreationTimestamp="2026-04-20 23:29:13 +0000 UTC" firstStartedPulling="2026-04-20 23:29:19.557388697 +0000 UTC m=+989.228757821" lastFinishedPulling="2026-04-20 23:29:19.778037511 +0000 UTC m=+989.449406636" observedRunningTime="2026-04-20 23:29:20.579683334 +0000 UTC m=+990.251052480" watchObservedRunningTime="2026-04-20 23:29:20.581065948 +0000 UTC m=+990.252435095" Apr 20 23:29:31.579248 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:31.579211 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs" Apr 20 23:29:32.627883 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:32.627841 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7fc85c9f59-k27wh"] Apr 20 23:29:32.632852 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:32.632825 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7fc85c9f59-k27wh"] Apr 20 23:29:32.633034 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:32.632940 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7fc85c9f59-k27wh" Apr 20 23:29:32.808676 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:32.808628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2j2b\" (UniqueName: \"kubernetes.io/projected/2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8-kube-api-access-p2j2b\") pod \"authorino-7fc85c9f59-k27wh\" (UID: \"2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8\") " pod="kuadrant-system/authorino-7fc85c9f59-k27wh" Apr 20 23:29:32.909457 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:32.909356 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2j2b\" (UniqueName: \"kubernetes.io/projected/2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8-kube-api-access-p2j2b\") pod \"authorino-7fc85c9f59-k27wh\" (UID: \"2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8\") " pod="kuadrant-system/authorino-7fc85c9f59-k27wh" Apr 20 23:29:32.917403 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:32.917373 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2j2b\" (UniqueName: \"kubernetes.io/projected/2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8-kube-api-access-p2j2b\") pod \"authorino-7fc85c9f59-k27wh\" (UID: \"2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8\") " pod="kuadrant-system/authorino-7fc85c9f59-k27wh" Apr 20 23:29:32.942937 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:32.942899 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7fc85c9f59-k27wh" Apr 20 23:29:33.081508 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:33.081479 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7fc85c9f59-k27wh"] Apr 20 23:29:33.083116 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:29:33.083080 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a27871a_4e81_4ab6_a8bb_b8a9a583a8e8.slice/crio-31cb47a15fe6a0a85e40414c6e9374410a74bdb233c3ff8f14037da86f78a3bd WatchSource:0}: Error finding container 31cb47a15fe6a0a85e40414c6e9374410a74bdb233c3ff8f14037da86f78a3bd: Status 404 returned error can't find the container with id 31cb47a15fe6a0a85e40414c6e9374410a74bdb233c3ff8f14037da86f78a3bd Apr 20 23:29:33.615743 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:33.615705 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7fc85c9f59-k27wh" event={"ID":"2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8","Type":"ContainerStarted","Data":"cea3fa2f51bd21efcce1d4e60beaf17c71977243e64d917a63b41079e0db8296"} Apr 20 23:29:33.615743 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:33.615747 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7fc85c9f59-k27wh" event={"ID":"2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8","Type":"ContainerStarted","Data":"31cb47a15fe6a0a85e40414c6e9374410a74bdb233c3ff8f14037da86f78a3bd"} Apr 20 23:29:33.630396 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:33.630184 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7fc85c9f59-k27wh" podStartSLOduration=1.238239652 podStartE2EDuration="1.630166813s" podCreationTimestamp="2026-04-20 23:29:32 +0000 UTC" firstStartedPulling="2026-04-20 23:29:33.084441111 +0000 UTC m=+1002.755810236" lastFinishedPulling="2026-04-20 23:29:33.476368266 +0000 UTC m=+1003.147737397" 
observedRunningTime="2026-04-20 23:29:33.629893604 +0000 UTC m=+1003.301262752" watchObservedRunningTime="2026-04-20 23:29:33.630166813 +0000 UTC m=+1003.301535961" Apr 20 23:29:33.656571 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:33.656534 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-48hbc"] Apr 20 23:29:33.656822 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:33.656750 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-48hbc" podUID="d2d8e2e6-16e5-42a9-bbcd-0d567bacf009" containerName="authorino" containerID="cri-o://dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585" gracePeriod=30 Apr 20 23:29:33.894594 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:33.894569 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-48hbc" Apr 20 23:29:34.018554 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.018524 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4tr7\" (UniqueName: \"kubernetes.io/projected/d2d8e2e6-16e5-42a9-bbcd-0d567bacf009-kube-api-access-l4tr7\") pod \"d2d8e2e6-16e5-42a9-bbcd-0d567bacf009\" (UID: \"d2d8e2e6-16e5-42a9-bbcd-0d567bacf009\") " Apr 20 23:29:34.020597 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.020575 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d8e2e6-16e5-42a9-bbcd-0d567bacf009-kube-api-access-l4tr7" (OuterVolumeSpecName: "kube-api-access-l4tr7") pod "d2d8e2e6-16e5-42a9-bbcd-0d567bacf009" (UID: "d2d8e2e6-16e5-42a9-bbcd-0d567bacf009"). InnerVolumeSpecName "kube-api-access-l4tr7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 23:29:34.119440 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.119401 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l4tr7\" (UniqueName: \"kubernetes.io/projected/d2d8e2e6-16e5-42a9-bbcd-0d567bacf009-kube-api-access-l4tr7\") on node \"ip-10-0-134-166.ec2.internal\" DevicePath \"\"" Apr 20 23:29:34.622886 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.622848 2577 generic.go:358] "Generic (PLEG): container finished" podID="d2d8e2e6-16e5-42a9-bbcd-0d567bacf009" containerID="dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585" exitCode=0 Apr 20 23:29:34.623084 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.622901 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-48hbc" Apr 20 23:29:34.623084 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.622932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-48hbc" event={"ID":"d2d8e2e6-16e5-42a9-bbcd-0d567bacf009","Type":"ContainerDied","Data":"dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585"} Apr 20 23:29:34.623084 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.622996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-48hbc" event={"ID":"d2d8e2e6-16e5-42a9-bbcd-0d567bacf009","Type":"ContainerDied","Data":"ee321cef0658d426f88d9a48246b2509ba5656074721b53d397917f8e725264d"} Apr 20 23:29:34.623084 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.623011 2577 scope.go:117] "RemoveContainer" containerID="dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585" Apr 20 23:29:34.632414 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.632392 2577 scope.go:117] "RemoveContainer" containerID="dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585" Apr 20 23:29:34.632706 ip-10-0-134-166 kubenswrapper[2577]: 
E0420 23:29:34.632677 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585\": container with ID starting with dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585 not found: ID does not exist" containerID="dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585" Apr 20 23:29:34.632746 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.632702 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585"} err="failed to get container status \"dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585\": rpc error: code = NotFound desc = could not find container \"dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585\": container with ID starting with dd61b11cab33371ddbb671a4a7e3d21999761d2f76ba402fc379e43a023d0585 not found: ID does not exist" Apr 20 23:29:34.644459 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.644432 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-48hbc"] Apr 20 23:29:34.646546 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.646528 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-48hbc"] Apr 20 23:29:34.918502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:29:34.918411 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d8e2e6-16e5-42a9-bbcd-0d567bacf009" path="/var/lib/kubelet/pods/d2d8e2e6-16e5-42a9-bbcd-0d567bacf009/volumes" Apr 20 23:31:52.950823 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:52.950790 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7fc85c9f59-k27wh_2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8/authorino/0.log" Apr 20 23:31:57.622137 ip-10-0-134-166 kubenswrapper[2577]: I0420 
23:31:57.622107 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d79c565b7-znsfx_6d311889-eb60-4d87-864d-2956f43f404a/manager/0.log" Apr 20 23:31:58.507641 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.507611 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh_1e6194a3-80c9-4b57-ab03-32d6aaf59c47/util/0.log" Apr 20 23:31:58.513903 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.513880 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh_1e6194a3-80c9-4b57-ab03-32d6aaf59c47/pull/0.log" Apr 20 23:31:58.519795 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.519775 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh_1e6194a3-80c9-4b57-ab03-32d6aaf59c47/extract/0.log" Apr 20 23:31:58.626389 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.626358 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4_5a276cc4-8103-401a-8e8c-2965d47b3cfc/util/0.log" Apr 20 23:31:58.633266 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.633234 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4_5a276cc4-8103-401a-8e8c-2965d47b3cfc/pull/0.log" Apr 20 23:31:58.639372 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.639348 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4_5a276cc4-8103-401a-8e8c-2965d47b3cfc/extract/0.log" Apr 20 23:31:58.747628 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.747595 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r_5bd464c7-cddf-45d2-8b39-67d3bbb399e5/util/0.log" Apr 20 23:31:58.754177 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.754151 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r_5bd464c7-cddf-45d2-8b39-67d3bbb399e5/pull/0.log" Apr 20 23:31:58.759995 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.759914 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r_5bd464c7-cddf-45d2-8b39-67d3bbb399e5/extract/0.log" Apr 20 23:31:58.866178 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.866148 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x_c2b42f5d-467d-4e2f-bd50-980d960dbe2f/util/0.log" Apr 20 23:31:58.872460 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.872428 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x_c2b42f5d-467d-4e2f-bd50-980d960dbe2f/pull/0.log" Apr 20 23:31:58.878956 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.878927 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x_c2b42f5d-467d-4e2f-bd50-980d960dbe2f/extract/0.log" Apr 20 23:31:58.992868 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:58.992832 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7fc85c9f59-k27wh_2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8/authorino/0.log" Apr 20 23:31:59.324315 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:59.324287 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-ppwgs_dac056d3-bd56-4772-80bc-fce96d35e020/kuadrant-console-plugin/0.log" Apr 20 23:31:59.438965 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:59.438914 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-mhh54_99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58/registry-server/0.log" Apr 20 23:31:59.561592 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:59.561558 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-k444g_16f36687-4eac-470f-9725-842eeb9bef3f/manager/0.log" Apr 20 23:31:59.791929 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:31:59.791846 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-pfdcl_5ab29ae2-f438-409a-a976-948fc8bc749a/manager/0.log" Apr 20 23:32:00.153544 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:00.153510 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt_3b7c0f57-f6d9-4449-869c-32f8ac8135ed/istio-proxy/0.log" Apr 20 23:32:00.612441 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:00.612407 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-95dhp_1fcfc911-4199-4661-947f-23b4b9e69300/istio-proxy/0.log" Apr 20 23:32:01.292261 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:01.292218 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-m5st8_0806a190-e33a-41e0-850e-00959e4bdd0e/storage-initializer/0.log" Apr 20 23:32:01.299406 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:01.299382 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-m5st8_0806a190-e33a-41e0-850e-00959e4bdd0e/main/0.log"
Apr 20 23:32:01.518502 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:01.518476 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs_65b1d81f-20e4-4c11-92fb-b8fa37f406ba/storage-initializer/0.log"
Apr 20 23:32:01.538841 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:01.538817 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-cmsbs_65b1d81f-20e4-4c11-92fb-b8fa37f406ba/main/0.log"
Apr 20 23:32:01.649302 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:01.649269 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg_24b1a3e6-aea2-4efa-af4e-6c219785763b/storage-initializer/0.log"
Apr 20 23:32:01.656700 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:01.656675 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-hhvcg_24b1a3e6-aea2-4efa-af4e-6c219785763b/main/0.log"
Apr 20 23:32:08.119536 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:08.119510 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9znkk_6bc92d9d-0606-47a0-bac5-0b39b85308a8/global-pull-secret-syncer/0.log"
Apr 20 23:32:08.234845 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:08.234782 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2mnql_33327497-14af-4ec5-a658-d9a33e5963c1/konnectivity-agent/0.log"
Apr 20 23:32:08.340959 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:08.340912 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-166.ec2.internal_6dec7aee4b5cc7a9a5eebe9083c21fbd/haproxy/0.log"
Apr 20 23:32:12.363841 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.363817 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh_1e6194a3-80c9-4b57-ab03-32d6aaf59c47/extract/0.log"
Apr 20 23:32:12.399426 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.399394 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh_1e6194a3-80c9-4b57-ab03-32d6aaf59c47/util/0.log"
Apr 20 23:32:12.420656 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.420624 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7598jhlh_1e6194a3-80c9-4b57-ab03-32d6aaf59c47/pull/0.log"
Apr 20 23:32:12.448145 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.448114 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4_5a276cc4-8103-401a-8e8c-2965d47b3cfc/extract/0.log"
Apr 20 23:32:12.467628 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.467594 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4_5a276cc4-8103-401a-8e8c-2965d47b3cfc/util/0.log"
Apr 20 23:32:12.486179 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.486152 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e042lb4_5a276cc4-8103-401a-8e8c-2965d47b3cfc/pull/0.log"
Apr 20 23:32:12.511697 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.511669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r_5bd464c7-cddf-45d2-8b39-67d3bbb399e5/extract/0.log"
Apr 20 23:32:12.531569 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.531543 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r_5bd464c7-cddf-45d2-8b39-67d3bbb399e5/util/0.log"
Apr 20 23:32:12.552134 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.552104 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73djp8r_5bd464c7-cddf-45d2-8b39-67d3bbb399e5/pull/0.log"
Apr 20 23:32:12.578673 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.578640 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x_c2b42f5d-467d-4e2f-bd50-980d960dbe2f/extract/0.log"
Apr 20 23:32:12.635911 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.635823 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x_c2b42f5d-467d-4e2f-bd50-980d960dbe2f/util/0.log"
Apr 20 23:32:12.678023 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.677998 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1spm4x_c2b42f5d-467d-4e2f-bd50-980d960dbe2f/pull/0.log"
Apr 20 23:32:12.751904 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.751868 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7fc85c9f59-k27wh_2a27871a-4e81-4ab6-a8bb-b8a9a583a8e8/authorino/0.log"
Apr 20 23:32:12.825586 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.825551 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-ppwgs_dac056d3-bd56-4772-80bc-fce96d35e020/kuadrant-console-plugin/0.log"
Apr 20 23:32:12.854361 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.854311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-mhh54_99f30f6e-5db1-4fdf-a67f-3d1f77c0ba58/registry-server/0.log"
Apr 20 23:32:12.894506 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.894411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-k444g_16f36687-4eac-470f-9725-842eeb9bef3f/manager/0.log"
Apr 20 23:32:12.943496 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:12.943454 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-pfdcl_5ab29ae2-f438-409a-a976-948fc8bc749a/manager/0.log"
Apr 20 23:32:14.783668 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:14.783636 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tq6lg_9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe/node-exporter/0.log"
Apr 20 23:32:14.813358 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:14.813327 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tq6lg_9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe/kube-rbac-proxy/0.log"
Apr 20 23:32:14.832524 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:14.832498 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tq6lg_9d3f5282-1b79-4bbc-88f1-23ec4a8f6fbe/init-textfile/0.log"
Apr 20 23:32:15.189681 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:15.189653 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68888c5bd7-mzktx_f615191a-9efd-4021-9ffc-8045826ad131/telemeter-client/0.log"
Apr 20 23:32:15.209630 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:15.209604 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68888c5bd7-mzktx_f615191a-9efd-4021-9ffc-8045826ad131/reload/0.log"
Apr 20 23:32:15.229616 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:15.229587 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-68888c5bd7-mzktx_f615191a-9efd-4021-9ffc-8045826ad131/kube-rbac-proxy/0.log"
Apr 20 23:32:16.427862 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.427824 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"]
Apr 20 23:32:16.428486 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.428462 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2d8e2e6-16e5-42a9-bbcd-0d567bacf009" containerName="authorino"
Apr 20 23:32:16.428576 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.428489 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d8e2e6-16e5-42a9-bbcd-0d567bacf009" containerName="authorino"
Apr 20 23:32:16.428629 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.428575 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2d8e2e6-16e5-42a9-bbcd-0d567bacf009" containerName="authorino"
Apr 20 23:32:16.431566 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.431543 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.434048 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.434023 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2tt9\"/\"kube-root-ca.crt\""
Apr 20 23:32:16.434172 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.434142 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-d2tt9\"/\"default-dockercfg-5mhdc\""
Apr 20 23:32:16.434908 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.434882 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-d2tt9\"/\"openshift-service-ca.crt\""
Apr 20 23:32:16.439729 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.439709 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"]
Apr 20 23:32:16.510366 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.510323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-proc\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.510555 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.510376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbczf\" (UniqueName: \"kubernetes.io/projected/b3a6a53b-5da9-4474-8173-c77a96be295d-kube-api-access-jbczf\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.510555 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.510409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-lib-modules\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.510555 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.510497 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-sys\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.510555 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.510545 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-podres\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.611720 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.611680 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-proc\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.611720 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.611724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbczf\" (UniqueName: \"kubernetes.io/projected/b3a6a53b-5da9-4474-8173-c77a96be295d-kube-api-access-jbczf\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.611927 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.611746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-lib-modules\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.611927 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.611781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-sys\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.611927 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.611806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-podres\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.611927 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.611820 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-proc\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.611927 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.611906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-sys\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.612115 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.611982 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-lib-modules\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.612115 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.611940 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b3a6a53b-5da9-4474-8173-c77a96be295d-podres\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.619633 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.619607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbczf\" (UniqueName: \"kubernetes.io/projected/b3a6a53b-5da9-4474-8173-c77a96be295d-kube-api-access-jbczf\") pod \"perf-node-gather-daemonset-s64lj\" (UID: \"b3a6a53b-5da9-4474-8173-c77a96be295d\") " pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.743004 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.742884 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:16.869562 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:16.869533 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"]
Apr 20 23:32:16.870576 ip-10-0-134-166 kubenswrapper[2577]: W0420 23:32:16.870549 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb3a6a53b_5da9_4474_8173_c77a96be295d.slice/crio-7815bd6c7afc4c38da0e05adf71c84d0b2e1d346ddfd20bfed2fb47ca1a5ce62 WatchSource:0}: Error finding container 7815bd6c7afc4c38da0e05adf71c84d0b2e1d346ddfd20bfed2fb47ca1a5ce62: Status 404 returned error can't find the container with id 7815bd6c7afc4c38da0e05adf71c84d0b2e1d346ddfd20bfed2fb47ca1a5ce62
Apr 20 23:32:17.249772 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:17.249733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj" event={"ID":"b3a6a53b-5da9-4474-8173-c77a96be295d","Type":"ContainerStarted","Data":"ae253ef390e0fd9cbae5ded29cc330f74eaef478e8de97a1619e89560498edfc"}
Apr 20 23:32:17.249772 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:17.249780 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:17.250035 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:17.249789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj" event={"ID":"b3a6a53b-5da9-4474-8173-c77a96be295d","Type":"ContainerStarted","Data":"7815bd6c7afc4c38da0e05adf71c84d0b2e1d346ddfd20bfed2fb47ca1a5ce62"}
Apr 20 23:32:17.266275 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:17.266211 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj" podStartSLOduration=1.266197194 podStartE2EDuration="1.266197194s" podCreationTimestamp="2026-04-20 23:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 23:32:17.265581888 +0000 UTC m=+1166.936951034" watchObservedRunningTime="2026-04-20 23:32:17.266197194 +0000 UTC m=+1166.937566341"
Apr 20 23:32:18.871088 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:18.871042 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ll8sp_b48ecb2f-b835-4a84-b0ca-b01696b7e237/dns/0.log"
Apr 20 23:32:18.892518 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:18.892492 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ll8sp_b48ecb2f-b835-4a84-b0ca-b01696b7e237/kube-rbac-proxy/0.log"
Apr 20 23:32:18.955181 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:18.955152 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zhjs2_e5e09233-03a8-4a45-aea7-fd8ccb794be7/dns-node-resolver/0.log"
Apr 20 23:32:19.399583 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:19.399553 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5f4p9_3e70db85-c7e8-41b6-a59d-3b622a5e7bb0/node-ca/0.log"
Apr 20 23:32:20.216826 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:20.216802 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf74dkt_3b7c0f57-f6d9-4449-869c-32f8ac8135ed/istio-proxy/0.log"
Apr 20 23:32:20.365430 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:20.365393 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-95dhp_1fcfc911-4199-4661-947f-23b4b9e69300/istio-proxy/0.log"
Apr 20 23:32:20.902134 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:20.902104 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mkjgb_93d27a11-204e-4737-b1e5-95f56ec3a768/serve-healthcheck-canary/0.log"
Apr 20 23:32:21.460577 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:21.460547 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c62gd_7880a2ec-8385-4990-a775-f9d711d92cc3/kube-rbac-proxy/0.log"
Apr 20 23:32:21.479445 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:21.479420 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c62gd_7880a2ec-8385-4990-a775-f9d711d92cc3/exporter/0.log"
Apr 20 23:32:21.500581 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:21.500557 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c62gd_7880a2ec-8385-4990-a775-f9d711d92cc3/extractor/0.log"
Apr 20 23:32:23.263152 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:23.263124 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-d2tt9/perf-node-gather-daemonset-s64lj"
Apr 20 23:32:23.472526 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:23.472493 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d79c565b7-znsfx_6d311889-eb60-4d87-864d-2956f43f404a/manager/0.log"
Apr 20 23:32:24.558240 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:24.558214 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6577b568b8-hlssw_09285221-0e2f-478e-b8fa-87bff03e5cef/manager/0.log"
Apr 20 23:32:24.600599 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:24.600568 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-srfxd_63c92598-680b-47e6-987b-58661192e3f0/openshift-lws-operator/0.log"
Apr 20 23:32:30.106625 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.106592 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2lxxb_b96ea9fa-073d-43b9-86ea-ea051d78bec9/kube-multus/0.log"
Apr 20 23:32:30.127573 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.127538 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2ng5_042233f9-bbe1-4bd7-acbd-b2180ef39cbb/kube-multus-additional-cni-plugins/0.log"
Apr 20 23:32:30.150704 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.148000 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2ng5_042233f9-bbe1-4bd7-acbd-b2180ef39cbb/egress-router-binary-copy/0.log"
Apr 20 23:32:30.165768 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.165740 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2ng5_042233f9-bbe1-4bd7-acbd-b2180ef39cbb/cni-plugins/0.log"
Apr 20 23:32:30.184328 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.184296 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2ng5_042233f9-bbe1-4bd7-acbd-b2180ef39cbb/bond-cni-plugin/0.log"
Apr 20 23:32:30.202985 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.202931 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2ng5_042233f9-bbe1-4bd7-acbd-b2180ef39cbb/routeoverride-cni/0.log"
Apr 20 23:32:30.221666 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.221641 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2ng5_042233f9-bbe1-4bd7-acbd-b2180ef39cbb/whereabouts-cni-bincopy/0.log"
Apr 20 23:32:30.240603 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.240573 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g2ng5_042233f9-bbe1-4bd7-acbd-b2180ef39cbb/whereabouts-cni/0.log"
Apr 20 23:32:30.600656 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.600613 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7vff9_25b85e22-f989-497b-a027-9ffb78b0533d/network-metrics-daemon/0.log"
Apr 20 23:32:30.618228 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:30.618206 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7vff9_25b85e22-f989-497b-a027-9ffb78b0533d/kube-rbac-proxy/0.log"
Apr 20 23:32:31.781410 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:31.781378 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbfps_da93c8a9-809d-4b57-b6ad-138eab016391/ovn-controller/0.log"
Apr 20 23:32:31.808767 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:31.808730 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbfps_da93c8a9-809d-4b57-b6ad-138eab016391/ovn-acl-logging/0.log"
Apr 20 23:32:31.827232 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:31.827201 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbfps_da93c8a9-809d-4b57-b6ad-138eab016391/kube-rbac-proxy-node/0.log"
Apr 20 23:32:31.846276 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:31.846252 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbfps_da93c8a9-809d-4b57-b6ad-138eab016391/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 23:32:31.866435 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:31.866402 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbfps_da93c8a9-809d-4b57-b6ad-138eab016391/northd/0.log"
Apr 20 23:32:31.885581 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:31.885556 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbfps_da93c8a9-809d-4b57-b6ad-138eab016391/nbdb/0.log"
Apr 20 23:32:31.903899 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:31.903871 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbfps_da93c8a9-809d-4b57-b6ad-138eab016391/sbdb/0.log"
Apr 20 23:32:32.013209 ip-10-0-134-166 kubenswrapper[2577]: I0420 23:32:32.013176 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kbfps_da93c8a9-809d-4b57-b6ad-138eab016391/ovnkube-controller/0.log"