Apr 16 19:15:24.076539 ip-10-0-130-163 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 19:15:24.076556 ip-10-0-130-163 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 19:15:24.076565 ip-10-0-130-163 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 19:15:24.076904 ip-10-0-130-163 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 19:15:34.147357 ip-10-0-130-163 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 19:15:34.147379 ip-10-0-130-163 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 708cd393b0f940a79b847c077e681f4d --
Apr 16 19:18:08.133758 ip-10-0-130-163 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:18:08.616076 ip-10-0-130-163 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:18:08.616762 ip-10-0-130-163 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:18:08.616762 ip-10-0-130-163 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:18:08.616762 ip-10-0-130-163 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:18:08.616762 ip-10-0-130-163 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:18:08.620449 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.620254 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:18:08.625756 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625733 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:18:08.625756 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625756 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:18:08.625756 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625760 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625763 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625768 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625771 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625774 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625776 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625779 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625782 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625785 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625787 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625790 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625793 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625796 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625799 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625802 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625805 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625807 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625814 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625817 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625820 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:18:08.625863 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625822 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625825 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625828 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625830 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625833 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625835 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625838 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625841 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625844 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625846 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625849 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625852 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625855 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625857 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625862 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625867 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625870 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625874 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625878 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625881 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:18:08.626351 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625884 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625887 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625890 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625893 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625896 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625899 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625901 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625904 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625906 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625909 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625911 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625924 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625927 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625930 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625932 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625936 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625940 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625944 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625947 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:18:08.626901 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625950 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625953 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625956 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625958 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625961 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625964 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625967 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625970 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625974 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625977 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625980 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625983 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625986 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625988 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625993 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625995 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.625998 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626002 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626004 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:18:08.627370 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626007 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626010 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626013 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626015 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626018 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626020 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626468 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626474 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626476 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626479 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626481 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626484 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626487 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626490 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626493 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626495 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626498 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626500 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626503 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626505 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:18:08.627836 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626509 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626512 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626514 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626517 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626520 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626522 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626525 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626528 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626531 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626533 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626536 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626538 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626541 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626544 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626546 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626549 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626551 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626554 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626557 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:18:08.628314 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626559 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626562 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626565 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626567 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626570 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626573 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626575 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626578 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626581 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626583 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626585 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626588 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626590 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626594 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626598 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626600 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626603 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626606 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626608 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626611 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:18:08.628816 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626614 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626617 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626619 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626622 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626624 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626627 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626629 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626632 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626634 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626636 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626639 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626642 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626644 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626647 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626650 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626653 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626655 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626658 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626662 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:18:08.629316 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626664 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626667 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626669 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626674 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626677 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626680 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626683 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626686 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626688 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626693 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626696 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626699 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626702 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.626705 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628335 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628347 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628356 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628360 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628365 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628368 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628373 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:18:08.629835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628377 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628381 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628384 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628387 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628391 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628394 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628397 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628400 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628417 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628420 2578 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628423 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628426 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628434 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628437 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628441 2578 flags.go:64] FLAG: --config-dir=""
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628444 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628447 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628452 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628455 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628458 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628461 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628464 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628468 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628471 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628474 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:18:08.630338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628477 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628482 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628486 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628489 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628492 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628495 2578 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628498 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628503 2578 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628506 2578 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628509 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628513 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628516 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628520 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628523 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628526 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628529 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628532 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628535 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628538 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628541 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628544 2578 flags.go:64] FLAG:
--fail-cgroupv1="false" Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628548 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628551 2578 flags.go:64] FLAG: --feature-gates="" Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628555 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628558 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:18:08.630962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628562 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628565 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628568 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628571 2578 flags.go:64] FLAG: --help="false" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628574 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-130-163.ec2.internal" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628577 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628580 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628583 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628587 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:18:08.628591 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628594 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628597 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628600 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628603 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628606 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628609 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628612 2578 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628615 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628618 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628621 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628624 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628627 2578 flags.go:64] FLAG: --lock-file="" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628630 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:18:08.631599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628633 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:18:08.631599 
ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628636 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628642 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628645 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628648 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628651 2578 flags.go:64] FLAG: --logging-format="text" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628653 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628657 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628660 2578 flags.go:64] FLAG: --manifest-url="" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628663 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628669 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628672 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628676 2578 flags.go:64] FLAG: --max-pods="110" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628679 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628682 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628685 2578 
flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628688 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628692 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628695 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628698 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628707 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628710 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628714 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628717 2578 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:18:08.632177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628720 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628726 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628729 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628733 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628736 2578 flags.go:64] FLAG: --port="10250" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:18:08.628739 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628742 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01c1bca11d1f8611f" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628745 2578 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628748 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628751 2578 flags.go:64] FLAG: --register-node="true" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628754 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628757 2578 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628761 2578 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628764 2578 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628767 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628770 2578 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628774 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628777 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628780 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628783 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:18:08.632799 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:18:08.628786 2578 flags.go:64] FLAG: --runonce="false" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628789 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628792 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628795 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628798 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628804 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:18:08.632799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628807 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628810 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628813 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628817 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628820 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628823 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628826 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628830 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628833 2578 
flags.go:64] FLAG: --system-cgroups="" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628836 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628841 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628844 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628847 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628854 2578 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628857 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628860 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628863 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628866 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628869 2578 flags.go:64] FLAG: --v="2" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628874 2578 flags.go:64] FLAG: --version="false" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628878 2578 flags.go:64] FLAG: --vmodule="" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628883 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.628886 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:18:08.633469 ip-10-0-130-163 
kubenswrapper[2578]: W0416 19:18:08.628989 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:18:08.633469 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.628992 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.628995 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.628998 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629001 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629004 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629007 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629010 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629014 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629017 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629019 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629022 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629024 2578 feature_gate.go:328] unrecognized 
feature gate: GatewayAPIController Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629028 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629031 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629033 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629036 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629039 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629042 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629044 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:18:08.634095 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629047 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629049 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629052 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629055 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629058 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 
19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629060 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629063 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629065 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629068 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629071 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629073 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629076 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629079 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629082 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629084 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629087 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629089 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629092 2578 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629096 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:18:08.634599 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629099 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629102 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629105 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629107 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629110 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629112 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629115 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629118 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629120 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629123 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629126 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI 
Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629130 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629133 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629136 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629138 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629141 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629143 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629146 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629148 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629151 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:18:08.635116 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629154 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629156 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629159 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629161 2578 feature_gate.go:328] 
unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629164 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629166 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629169 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629171 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629173 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629176 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629178 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629181 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629183 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629186 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629188 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629191 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:18:08.635625 ip-10-0-130-163 
kubenswrapper[2578]: W0416 19:18:08.629193 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629195 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629198 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629200 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 19:18:08.635625 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629204 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 19:18:08.636120 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629207 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:18:08.636120 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629209 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:18:08.636120 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629211 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:18:08.636120 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629214 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 19:18:08.636120 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629216 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:18:08.636120 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.629219 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:18:08.636120 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.629224 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false 
NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:18:08.637178 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.637153 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:18:08.637225 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.637180 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:18:08.637260 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637251 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637260 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637265 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637268 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637271 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637273 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637276 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637279 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637282 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637284 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637287 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637289 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637292 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:18:08.637289 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637295 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637298 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637301 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637303 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637305 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637308 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637313 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637315 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637318 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637321 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637323 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637326 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637329 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637331 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637335 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637341 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637344 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637347 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637350 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:18:08.637715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637353 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637355 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637358 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637360 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637363 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637365 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637367 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637370 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637373 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637375 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637377 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637380 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637382 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637384 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637387 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637391 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637393 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637396 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637399 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637420 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:18:08.638190 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637425 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637428 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637431 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637434 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637436 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637439 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637441 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637444 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637446 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637449 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637452 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637454 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637458 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637460 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637463 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637466 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637468 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637471 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637474 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637476 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:18:08.638751 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637479 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637482 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637484 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637487 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637490 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637492 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637496 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637501 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637504 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637506 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637509 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637512 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637515 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637517 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.637522 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:18:08.639236 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637649 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637655 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637658 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637661 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637664 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637666 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637669 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637671 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637674 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637676 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637679 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637682 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637684 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637687 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637690 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637692 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637694 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637697 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637699 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637702 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:18:08.639630 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637704 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637706 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637709 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637711 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637714 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637717 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637719 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637723 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637726 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637729 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637731 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637734 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637736 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637740 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637742 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637745 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637747 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637750 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637752 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637754 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:18:08.640119 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637757 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637759 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637762 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637765 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637767 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637770 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637772 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637775 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637777 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637779 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637782 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637784 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637787 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637789 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637792 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637795 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637797 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637800 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637804 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637808 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:18:08.640637 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637811 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637814 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637816 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637819 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637821 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637824 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637826 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637830 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637833 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637836 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637839 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637842 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637845 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637848 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637851 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637853 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637857 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637859 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637862 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:18:08.641124 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637864 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:18:08.641692 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637867 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:18:08.641692 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637869 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:18:08.641692 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637872 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:18:08.641692 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637874 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:18:08.641692 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637877 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:18:08.641692 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:08.637879 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:18:08.641692 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.637884 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:18:08.641692 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.638039 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:18:08.641692 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.641621 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:18:08.642617 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.642604 2578 server.go:1019] "Starting client certificate rotation"
Apr 16 19:18:08.642738 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.642716 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:18:08.642792 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.642780 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:18:08.672556 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.672528 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:18:08.679838 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.679803 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:18:08.696256 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.696225 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:18:08.702952 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.702929 2578 log.go:25] "Validated CRI v1 image API"
Apr 16 19:18:08.704843 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.704814 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:18:08.705213 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.705195 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:18:08.710768 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.710729 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 93da4167-e2f4-478e-b5cd-8aee489c8f08:/dev/nvme0n1p3 e33a97da-8289-4ed8-9b14-86fcbbdc8964:/dev/nvme0n1p4]
Apr 16 19:18:08.710768 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.710764 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:18:08.717313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.717173 2578 manager.go:217] Machine: {Timestamp:2026-04-16 19:18:08.715103559 +0000 UTC m=+0.453006818 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100314 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b25e057580abce36c2c37af4b9888 SystemUUID:ec2b25e0-5758-0abc-e36c-2c37af4b9888 BootID:708cd393-b0f9-40a7-9b84-7c077e681f4d Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:79:39:00:03:4d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:79:39:00:03:4d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:28:78:ed:60:2f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:18:08.717313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.717305 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:18:08.717439 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.717401 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 19:18:08.718693 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.718661 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 19:18:08.718851 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.718697 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-163.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 19:18:08.718898 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.718861 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 19:18:08.718898 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.718870 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 19:18:08.718898 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.718884 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:18:08.719885 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.719873 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:18:08.721307 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.721295 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:18:08.721451 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.721441 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:18:08.723828 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.723815 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:18:08.723877 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.723856 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:18:08.723877 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.723868 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:18:08.723877 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.723878 2578 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:18:08.723965 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.723890 2578 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 19:18:08.725124 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.725110 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:18:08.725173 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.725131 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:18:08.728679 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.728654 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:18:08.728935 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.728918 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qbk6g" Apr 16 19:18:08.730107 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.730093 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:18:08.731596 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731584 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:18:08.731644 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731602 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:18:08.731644 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731608 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:18:08.731644 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731614 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:18:08.731644 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731620 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:18:08.731644 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:18:08.731626 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:18:08.731644 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731634 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 19:18:08.731644 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731642 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:18:08.731903 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731655 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:18:08.731903 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731662 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:18:08.731903 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731672 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:18:08.731903 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.731681 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:18:08.732716 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.732704 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:18:08.732716 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.732716 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:18:08.737466 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.737437 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qbk6g" Apr 16 19:18:08.738538 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.738516 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:18:08.738648 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.738575 2578 server.go:1295] "Started kubelet" Apr 16 19:18:08.738814 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.738723 2578 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 16 19:18:08.738870 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.738728 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:18:08.738870 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.738863 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:18:08.739601 ip-10-0-130-163 systemd[1]: Started Kubernetes Kubelet. Apr 16 19:18:08.739747 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.739639 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-163.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:18:08.739747 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.739682 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-163.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:18:08.739845 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.739746 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:18:08.740197 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.740179 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:18:08.741868 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.741849 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:18:08.748007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.747987 2578 
certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:18:08.748636 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.748619 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:18:08.749325 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.749306 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:18:08.749454 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.749438 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:18:08.749592 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.749370 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:18:08.749674 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.749625 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:18:08.749674 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.749634 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:18:08.750913 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.749826 2578 factory.go:55] Registering systemd factory Apr 16 19:18:08.750913 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.749848 2578 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:18:08.750913 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.750192 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:08.751455 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.751433 2578 factory.go:153] Registering CRI-O factory Apr 16 19:18:08.751455 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.751458 2578 factory.go:223] Registration of the crio container factory successfully Apr 16 19:18:08.751595 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.751517 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: 
cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:18:08.751595 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.751546 2578 factory.go:103] Registering Raw factory Apr 16 19:18:08.751595 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.751564 2578 manager.go:1196] Started watching for new ooms in manager Apr 16 19:18:08.751908 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.751893 2578 manager.go:319] Starting recovery of all containers Apr 16 19:18:08.752207 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.752177 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:18:08.752349 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.752329 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:18:08.755555 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.755532 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-163.ec2.internal\" not found" node="ip-10-0-130-163.ec2.internal" Apr 16 19:18:08.762778 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.762585 2578 manager.go:324] Recovery completed Apr 16 19:18:08.767945 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.767926 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:18:08.770733 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.770715 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:18:08.770830 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.770750 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 
19:18:08.770830 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.770766 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:18:08.771233 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.771219 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:18:08.771282 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.771233 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:18:08.771282 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.771250 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:18:08.773732 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.773719 2578 policy_none.go:49] "None policy: Start" Apr 16 19:18:08.773784 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.773736 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:18:08.773784 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.773747 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:18:08.811086 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.811061 2578 manager.go:341] "Starting Device Plugin manager" Apr 16 19:18:08.816127 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.811104 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:18:08.816127 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.811115 2578 server.go:85] "Starting device plugin registration server" Apr 16 19:18:08.816127 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.811427 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:18:08.816127 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.811444 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:18:08.816127 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.811535 2578 
plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:18:08.816127 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.811630 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:18:08.816127 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.811640 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:18:08.816127 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.812920 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:18:08.816127 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.812958 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:08.880188 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.880095 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:18:08.881627 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.881608 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 19:18:08.881713 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.881640 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:18:08.881713 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.881662 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 19:18:08.881713 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.881669 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:18:08.881849 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.881712 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:18:08.884012 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.883988 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:18:08.912056 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.912015 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:18:08.913080 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.913061 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:18:08.913184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.913099 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:18:08.913184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.913114 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:18:08.913184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.913146 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-163.ec2.internal" Apr 16 19:18:08.926243 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.926211 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-163.ec2.internal" Apr 16 19:18:08.926387 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.926251 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-163.ec2.internal\": node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 
19:18:08.945758 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:08.945728 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:08.982587 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.982554 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal"] Apr 16 19:18:08.982668 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.982634 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:18:08.983667 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.983650 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:18:08.983769 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.983681 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:18:08.983769 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.983691 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:18:08.984901 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.984888 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:18:08.985096 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.985081 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" Apr 16 19:18:08.985154 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.985109 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:18:08.985625 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.985610 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:18:08.985625 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.985620 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:18:08.985748 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.985637 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:18:08.985748 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.985646 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:18:08.985829 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.985645 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:18:08.985829 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.985821 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:18:08.986730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.986714 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal" Apr 16 19:18:08.986793 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.986742 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:18:08.987456 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.987442 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:18:08.987516 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.987466 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:18:08.987516 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:08.987477 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:18:09.031310 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.031285 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-163.ec2.internal\" not found" node="ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.035764 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.035744 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-163.ec2.internal\" not found" node="ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.046477 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.046451 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.051830 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.051805 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9b83e43084b16f544f9da37f924cc2fa-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal\" (UID: \"9b83e43084b16f544f9da37f924cc2fa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.051924 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.051844 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b83e43084b16f544f9da37f924cc2fa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal\" (UID: \"9b83e43084b16f544f9da37f924cc2fa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.051924 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.051866 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/01a926c8ca0a1b00ab2f53afa74bfa04-config\") pod \"kube-apiserver-proxy-ip-10-0-130-163.ec2.internal\" (UID: \"01a926c8ca0a1b00ab2f53afa74bfa04\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.147454 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.147329 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.152785 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.152760 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9b83e43084b16f544f9da37f924cc2fa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal\" (UID: \"9b83e43084b16f544f9da37f924cc2fa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.152888 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.152797 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9b83e43084b16f544f9da37f924cc2fa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal\" (UID: \"9b83e43084b16f544f9da37f924cc2fa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.152888 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.152829 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/01a926c8ca0a1b00ab2f53afa74bfa04-config\") pod \"kube-apiserver-proxy-ip-10-0-130-163.ec2.internal\" (UID: \"01a926c8ca0a1b00ab2f53afa74bfa04\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.152888 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.152858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9b83e43084b16f544f9da37f924cc2fa-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal\" (UID: \"9b83e43084b16f544f9da37f924cc2fa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.152888 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.152873 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9b83e43084b16f544f9da37f924cc2fa-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal\" (UID: \"9b83e43084b16f544f9da37f924cc2fa\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.153042 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.152886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/01a926c8ca0a1b00ab2f53afa74bfa04-config\") pod \"kube-apiserver-proxy-ip-10-0-130-163.ec2.internal\" (UID: \"01a926c8ca0a1b00ab2f53afa74bfa04\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.248253 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.248205 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.334754 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.334721 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.338357 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.338334 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal" Apr 16 19:18:09.349277 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.349249 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.449849 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.449757 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.550266 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.550227 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.642603 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.642560 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 19:18:09.643188 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.642770 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:18:09.643188 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:18:09.642778 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:18:09.650832 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.650803 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.670634 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.670594 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:18:09.740076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.740030 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:13:08 +0000 UTC" deadline="2027-09-23 03:47:10.965429356 +0000 UTC" Apr 16 19:18:09.740076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.740067 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12584h29m1.225366087s" Apr 16 19:18:09.748164 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.748132 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 19:18:09.751162 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.751141 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.759723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.759687 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:18:09.778784 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:18:09.778748 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-clrrd" Apr 16 19:18:09.784637 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.784611 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-clrrd" Apr 16 19:18:09.841495 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:09.841459 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b83e43084b16f544f9da37f924cc2fa.slice/crio-5a2ef43ca0b06f31e09671edafcc27c2b5bd30605aad49d58d8813fb6b2ea647 WatchSource:0}: Error finding container 5a2ef43ca0b06f31e09671edafcc27c2b5bd30605aad49d58d8813fb6b2ea647: Status 404 returned error can't find the container with id 5a2ef43ca0b06f31e09671edafcc27c2b5bd30605aad49d58d8813fb6b2ea647 Apr 16 19:18:09.842958 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:09.842927 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01a926c8ca0a1b00ab2f53afa74bfa04.slice/crio-c3b59093f6439e9d6c8b1eca658b0386e2fee7cf1c54f8f21af16225f29add66 WatchSource:0}: Error finding container c3b59093f6439e9d6c8b1eca658b0386e2fee7cf1c54f8f21af16225f29add66: Status 404 returned error can't find the container with id c3b59093f6439e9d6c8b1eca658b0386e2fee7cf1c54f8f21af16225f29add66 Apr 16 19:18:09.847991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.847973 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:18:09.851808 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.851789 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.886982 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.886929 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" event={"ID":"9b83e43084b16f544f9da37f924cc2fa","Type":"ContainerStarted","Data":"5a2ef43ca0b06f31e09671edafcc27c2b5bd30605aad49d58d8813fb6b2ea647"} Apr 16 19:18:09.890082 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.890040 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal" event={"ID":"01a926c8ca0a1b00ab2f53afa74bfa04","Type":"ContainerStarted","Data":"c3b59093f6439e9d6c8b1eca658b0386e2fee7cf1c54f8f21af16225f29add66"} Apr 16 19:18:09.952429 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:09.952362 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-163.ec2.internal\" not found" Apr 16 19:18:09.992257 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:09.992148 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:18:10.049203 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.049160 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal" Apr 16 19:18:10.059738 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.059702 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:18:10.060895 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.060875 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" Apr 16 19:18:10.070331 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.070305 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:18:10.483360 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:18:10.483281 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:18:10.725577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.725548 2578 apiserver.go:52] "Watching apiserver" Apr 16 19:18:10.732837 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.732810 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 19:18:10.734567 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.734482 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-h9xsl","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc","openshift-dns/node-resolver-kndsd","openshift-multus/multus-bl8m2","openshift-multus/network-metrics-daemon-xhnjz","openshift-network-diagnostics/network-check-target-m5l2p","openshift-network-operator/iptables-alerter-s2zm4","openshift-ovn-kubernetes/ovnkube-node-bcplr","kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal","openshift-cluster-node-tuning-operator/tuned-9k9rp","openshift-image-registry/node-ca-v9w9k","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal","openshift-multus/multus-additional-cni-plugins-sqvwz"] Apr 16 19:18:10.737053 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.737031 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.739080 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.739044 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:10.739197 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:10.739130 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:10.739641 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.739621 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:18:10.739717 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.739679 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dt27d\"" Apr 16 19:18:10.740329 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.740302 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:18:10.741180 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.741163 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:10.741304 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:10.741226 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:10.743745 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.743377 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s2zm4" Apr 16 19:18:10.745175 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.745157 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:18:10.745530 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.745468 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dmmlb\"" Apr 16 19:18:10.745625 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.745597 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 19:18:10.745693 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.745630 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 19:18:10.748997 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.748976 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.749111 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.748985 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-h9xsl" Apr 16 19:18:10.751578 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.751362 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:18:10.751578 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.751386 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:18:10.751578 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.751451 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:18:10.751578 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.751504 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:18:10.751834 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.751763 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sh9hm\"" Apr 16 19:18:10.753073 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.752539 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:18:10.753073 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.752599 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:18:10.753073 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.752689 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gcn7q\"" Apr 16 19:18:10.753073 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.752806 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 
19:18:10.753073 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.752884 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:18:10.755113 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.755086 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.757386 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.757368 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:18:10.757933 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.757902 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qxzft\"" Apr 16 19:18:10.757933 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.757926 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:18:10.758119 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.758078 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:18:10.758287 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.758272 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v9w9k" Apr 16 19:18:10.758386 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.758366 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kndsd" Apr 16 19:18:10.760706 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.760686 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.761056 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.761034 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 19:18:10.761162 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.761057 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 19:18:10.761345 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.761328 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:18:10.761456 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.761441 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jrghd\"" Apr 16 19:18:10.761523 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.761471 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:18:10.761523 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.761441 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 19:18:10.761856 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.761840 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fsnql\"" Apr 16 19:18:10.762486 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762466 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fdrl6\"" Apr 16 19:18:10.762573 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762546 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d90da04b-9bd0-4142-9e92-b6e47a4f708c-host\") pod \"node-ca-v9w9k\" (UID: \"d90da04b-9bd0-4142-9e92-b6e47a4f708c\") " pod="openshift-image-registry/node-ca-v9w9k" Apr 16 19:18:10.762617 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-ovnkube-script-lib\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.762617 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762592 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-sysconfig\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.762617 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762613 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-host\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.762706 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762638 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvmft\" (UniqueName: \"kubernetes.io/projected/6d4442d8-719c-4d16-8de1-b4ca7c709645-kube-api-access-wvmft\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.762706 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762661 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-kubernetes\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.762706 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c01ac44c-e356-4ba6-9aa3-005d0558378f-agent-certs\") pod \"konnectivity-agent-h9xsl\" (UID: \"c01ac44c-e356-4ba6-9aa3-005d0558378f\") " pod="kube-system/konnectivity-agent-h9xsl" Apr 16 19:18:10.762706 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzc88\" (UniqueName: \"kubernetes.io/projected/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-kube-api-access-dzc88\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:10.762819 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762718 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-log-socket\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.762819 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-run-ovn-kubernetes\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.762819 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762764 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.762819 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762788 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-ovnkube-config\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.762819 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-env-overrides\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-sysctl-conf\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.763007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762852 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/99c90d56-3b40-4f40-bc97-12d990afb385-iptables-alerter-script\") pod \"iptables-alerter-s2zm4\" (UID: \"99c90d56-3b40-4f40-bc97-12d990afb385\") " pod="openshift-network-operator/iptables-alerter-s2zm4" Apr 16 19:18:10.763007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762867 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-var-lib-openvswitch\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762880 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-etc-openvswitch\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762896 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwf4s\" (UniqueName: \"kubernetes.io/projected/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-kube-api-access-vwf4s\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762910 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d90da04b-9bd0-4142-9e92-b6e47a4f708c-serviceca\") pod \"node-ca-v9w9k\" (UID: \"d90da04b-9bd0-4142-9e92-b6e47a4f708c\") " pod="openshift-image-registry/node-ca-v9w9k" Apr 16 19:18:10.763007 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:18:10.762925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-run\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.763007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.762967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-tuned\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.763007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763002 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c01ac44c-e356-4ba6-9aa3-005d0558378f-konnectivity-ca\") pod \"konnectivity-agent-h9xsl\" (UID: \"c01ac44c-e356-4ba6-9aa3-005d0558378f\") " pod="kube-system/konnectivity-agent-h9xsl" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763030 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-kubelet\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-run-netns\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763078 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-node-log\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763101 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-cni-bin\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-cni-netd\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-ovn-node-metrics-cert\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763169 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-modprobe-d\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763193 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763204 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-lib-modules\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763274 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99c90d56-3b40-4f40-bc97-12d990afb385-host-slash\") pod \"iptables-alerter-s2zm4\" (UID: \"99c90d56-3b40-4f40-bc97-12d990afb385\") " pod="openshift-network-operator/iptables-alerter-s2zm4" Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763303 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-systemd-units\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-slash\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-run-openvswitch\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763378 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d4442d8-719c-4d16-8de1-b4ca7c709645-tmp\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763423 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-run-systemd\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.763569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763445 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-systemd\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.764125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763474 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-sys\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.764125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763497 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twjg8\" (UniqueName: \"kubernetes.io/projected/99c90d56-3b40-4f40-bc97-12d990afb385-kube-api-access-twjg8\") pod \"iptables-alerter-s2zm4\" (UID: \"99c90d56-3b40-4f40-bc97-12d990afb385\") " pod="openshift-network-operator/iptables-alerter-s2zm4"
Apr 16 19:18:10.764125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-run-ovn\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.764125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763563 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxzhm\" (UniqueName: \"kubernetes.io/projected/d90da04b-9bd0-4142-9e92-b6e47a4f708c-kube-api-access-kxzhm\") pod \"node-ca-v9w9k\" (UID: \"d90da04b-9bd0-4142-9e92-b6e47a4f708c\") " pod="openshift-image-registry/node-ca-v9w9k"
Apr 16 19:18:10.764125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763591 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-sysctl-d\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.764125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-var-lib-kubelet\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.764125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.763660 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:10.765300 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.764805 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:18:10.765300 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.764869 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:18:10.765300 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.765002 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:18:10.765300 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.765018 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:18:10.765591 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.765339 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:18:10.765591 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.765425 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-bc9sd\""
Apr 16 19:18:10.765591 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.765342 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:18:10.785245 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.785217 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:13:09 +0000 UTC" deadline="2027-11-12 23:59:29.610900743 +0000 UTC"
Apr 16 19:18:10.785245 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.785243 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13804h41m18.825660578s"
Apr 16 19:18:10.850458 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.850424 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 19:18:10.864124 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-run-ovn-kubernetes\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864308 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864308 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864184 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-ovnkube-config\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864308 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-sysctl-conf\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.864308 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864225 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-run-ovn-kubernetes\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864308 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-run-k8s-cni-cncf-io\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.864308 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-var-lib-cni-multus\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.864308 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864308 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/99c90d56-3b40-4f40-bc97-12d990afb385-iptables-alerter-script\") pod \"iptables-alerter-s2zm4\" (UID: \"99c90d56-3b40-4f40-bc97-12d990afb385\") " pod="openshift-network-operator/iptables-alerter-s2zm4"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864338 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d90da04b-9bd0-4142-9e92-b6e47a4f708c-serviceca\") pod \"node-ca-v9w9k\" (UID: \"d90da04b-9bd0-4142-9e92-b6e47a4f708c\") " pod="openshift-image-registry/node-ca-v9w9k"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864370 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-tuned\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-conf-dir\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864427 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-sysctl-conf\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864447 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-kubelet\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864472 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-node-log\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864498 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-cni-bin\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864522 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-cni-netd\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864550 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-ovn-node-metrics-cert\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864580 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-modprobe-d\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864590 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-node-log\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864601 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c8ee73e7-c318-4fd0-a148-eef6ac668052-hosts-file\") pod \"node-resolver-kndsd\" (UID: \"c8ee73e7-c318-4fd0-a148-eef6ac668052\") " pod="openshift-dns/node-resolver-kndsd"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-os-release\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864626 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-kubelet\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864632 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864647 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99c90d56-3b40-4f40-bc97-12d990afb385-host-slash\") pod \"iptables-alerter-s2zm4\" (UID: \"99c90d56-3b40-4f40-bc97-12d990afb385\") " pod="openshift-network-operator/iptables-alerter-s2zm4"
Apr 16 19:18:10.864723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-run-openvswitch\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864681 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-socket-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864720 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-var-lib-kubelet\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-run-systemd\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-sys\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864791 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.864975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-ovnkube-config\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865011 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-modprobe-d\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/99c90d56-3b40-4f40-bc97-12d990afb385-iptables-alerter-script\") pod \"iptables-alerter-s2zm4\" (UID: \"99c90d56-3b40-4f40-bc97-12d990afb385\") " pod="openshift-network-operator/iptables-alerter-s2zm4"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/99c90d56-3b40-4f40-bc97-12d990afb385-host-slash\") pod \"iptables-alerter-s2zm4\" (UID: \"99c90d56-3b40-4f40-bc97-12d990afb385\") " pod="openshift-network-operator/iptables-alerter-s2zm4"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865085 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-run-openvswitch\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-cni-bin\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865230 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-sys\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865267 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-run-systemd\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865421 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-cni-netd\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.865478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d90da04b-9bd0-4142-9e92-b6e47a4f708c-serviceca\") pod \"node-ca-v9w9k\" (UID: \"d90da04b-9bd0-4142-9e92-b6e47a4f708c\") " pod="openshift-image-registry/node-ca-v9w9k"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865527 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-registration-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twjg8\" (UniqueName: \"kubernetes.io/projected/99c90d56-3b40-4f40-bc97-12d990afb385-kube-api-access-twjg8\") pod \"iptables-alerter-s2zm4\" (UID: \"99c90d56-3b40-4f40-bc97-12d990afb385\") " pod="openshift-network-operator/iptables-alerter-s2zm4"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-run-ovn\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxzhm\" (UniqueName: \"kubernetes.io/projected/d90da04b-9bd0-4142-9e92-b6e47a4f708c-kube-api-access-kxzhm\") pod \"node-ca-v9w9k\" (UID: \"d90da04b-9bd0-4142-9e92-b6e47a4f708c\") " pod="openshift-image-registry/node-ca-v9w9k"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-sysctl-d\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865736 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-run-ovn\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865779 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-var-lib-kubelet\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-var-lib-kubelet\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865892 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-sysctl-d\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865907 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-cni-dir\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.865976 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-socket-dir-parent\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.866047 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866036 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-hostroot\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-sysconfig\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866090 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-kubernetes\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866110 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c01ac44c-e356-4ba6-9aa3-005d0558378f-agent-certs\") pod \"konnectivity-agent-h9xsl\" (UID: \"c01ac44c-e356-4ba6-9aa3-005d0558378f\") " pod="kube-system/konnectivity-agent-h9xsl"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzc88\" (UniqueName: \"kubernetes.io/projected/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-kube-api-access-dzc88\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866155 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-var-lib-cni-bin\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-daemon-config\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-kubernetes\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866200 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-os-release\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866224 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-cni-binary-copy\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866243 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-sysconfig\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-log-socket\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866309 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-log-socket\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866442 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-env-overrides\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866533 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-run\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.866577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c01ac44c-e356-4ba6-9aa3-005d0558378f-konnectivity-ca\") pod \"konnectivity-agent-h9xsl\" (UID: \"c01ac44c-e356-4ba6-9aa3-005d0558378f\") " pod="kube-system/konnectivity-agent-h9xsl"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-run\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866603 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-etc-selinux\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-system-cni-dir\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-var-lib-openvswitch\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-etc-openvswitch\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866767 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-etc-openvswitch\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866797 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwf4s\" (UniqueName: \"kubernetes.io/projected/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-kube-api-access-vwf4s\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-var-lib-openvswitch\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.866955 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-env-overrides\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl6mk\" (UniqueName: \"kubernetes.io/projected/2c0e3e53-533e-40a3-85d0-53f899612b42-kube-api-access-xl6mk\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c8ee73e7-c318-4fd0-a148-eef6ac668052-tmp-dir\") pod \"node-resolver-kndsd\" (UID: \"c8ee73e7-c318-4fd0-a148-eef6ac668052\") " pod="openshift-dns/node-resolver-kndsd"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867103 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c01ac44c-e356-4ba6-9aa3-005d0558378f-konnectivity-ca\") pod \"konnectivity-agent-h9xsl\" (UID: \"c01ac44c-e356-4ba6-9aa3-005d0558378f\") " pod="kube-system/konnectivity-agent-h9xsl"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867105 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-cnibin\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867161 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867192 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz"
Apr 16 19:18:10.867249 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-run-netns\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr"
Apr 16 19:18:10.867249
ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-lib-modules\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867279 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-sys-fs\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867305 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-systemd-units\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-run-netns\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-slash\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" 
Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867365 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-lib-modules\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d4442d8-719c-4d16-8de1-b4ca7c709645-tmp\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867378 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-host-slash\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867435 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-systemd-units\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867435 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-system-cni-dir\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" 
Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867493 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-systemd\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-systemd\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.868021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.867552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xck4w\" (UniqueName: \"kubernetes.io/projected/c8ee73e7-c318-4fd0-a148-eef6ac668052-kube-api-access-xck4w\") pod \"node-resolver-kndsd\" (UID: \"c8ee73e7-c318-4fd0-a148-eef6ac668052\") " pod="openshift-dns/node-resolver-kndsd" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-cnibin\") pod \"multus-bl8m2\" (UID: 
\"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f85164ec-72d2-43d8-8a96-11e63cc91aeb-cni-binary-copy\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868759 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-run-multus-certs\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868789 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-etc-kubernetes\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868822 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnpk8\" (UniqueName: \"kubernetes.io/projected/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-kube-api-access-gnpk8\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d90da04b-9bd0-4142-9e92-b6e47a4f708c-host\") pod \"node-ca-v9w9k\" (UID: \"d90da04b-9bd0-4142-9e92-b6e47a4f708c\") " pod="openshift-image-registry/node-ca-v9w9k" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-ovnkube-script-lib\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868944 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-host\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.868974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvmft\" (UniqueName: \"kubernetes.io/projected/6d4442d8-719c-4d16-8de1-b4ca7c709645-kube-api-access-wvmft\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.869005 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.869035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-device-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.869062 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-run-netns\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.869094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c75x\" (UniqueName: \"kubernetes.io/projected/f85164ec-72d2-43d8-8a96-11e63cc91aeb-kube-api-access-9c75x\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:10.869306 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.869355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6d4442d8-719c-4d16-8de1-b4ca7c709645-etc-tuned\") pod 
\"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.869711 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:10.869391 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs podName:81fa50c0-8c06-4a6c-9d00-a1ed89b88844 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:11.369361014 +0000 UTC m=+3.107264283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs") pod "network-metrics-daemon-xhnjz" (UID: "81fa50c0-8c06-4a6c-9d00-a1ed89b88844") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:10.870565 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.869523 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-ovn-node-metrics-cert\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.870565 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.869831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d4442d8-719c-4d16-8de1-b4ca7c709645-host\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.870565 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.869900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d90da04b-9bd0-4142-9e92-b6e47a4f708c-host\") pod \"node-ca-v9w9k\" (UID: \"d90da04b-9bd0-4142-9e92-b6e47a4f708c\") " pod="openshift-image-registry/node-ca-v9w9k" Apr 16 19:18:10.871253 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:18:10.871221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-ovnkube-script-lib\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.872631 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.872605 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d4442d8-719c-4d16-8de1-b4ca7c709645-tmp\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.872843 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.872826 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c01ac44c-e356-4ba6-9aa3-005d0558378f-agent-certs\") pod \"konnectivity-agent-h9xsl\" (UID: \"c01ac44c-e356-4ba6-9aa3-005d0558378f\") " pod="kube-system/konnectivity-agent-h9xsl" Apr 16 19:18:10.873688 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:10.873669 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:10.873756 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:10.873700 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:10.873756 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:10.873715 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8fbx4 for pod openshift-network-diagnostics/network-check-target-m5l2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:10.873896 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:10.873874 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4 podName:82ff5922-30ac-4de1-81e9-3d15bce731aa nodeName:}" failed. No retries permitted until 2026-04-16 19:18:11.373854133 +0000 UTC m=+3.111757401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-8fbx4" (UniqueName: "kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4") pod "network-check-target-m5l2p" (UID: "82ff5922-30ac-4de1-81e9-3d15bce731aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:10.876574 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.876546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzc88\" (UniqueName: \"kubernetes.io/projected/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-kube-api-access-dzc88\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:10.877591 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.877570 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twjg8\" (UniqueName: \"kubernetes.io/projected/99c90d56-3b40-4f40-bc97-12d990afb385-kube-api-access-twjg8\") pod \"iptables-alerter-s2zm4\" (UID: \"99c90d56-3b40-4f40-bc97-12d990afb385\") " pod="openshift-network-operator/iptables-alerter-s2zm4" Apr 16 19:18:10.877679 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.877603 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxzhm\" (UniqueName: 
\"kubernetes.io/projected/d90da04b-9bd0-4142-9e92-b6e47a4f708c-kube-api-access-kxzhm\") pod \"node-ca-v9w9k\" (UID: \"d90da04b-9bd0-4142-9e92-b6e47a4f708c\") " pod="openshift-image-registry/node-ca-v9w9k" Apr 16 19:18:10.877736 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.877718 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwf4s\" (UniqueName: \"kubernetes.io/projected/5b22e5a6-5ee7-48e8-b3d5-b1b77686c765-kube-api-access-vwf4s\") pod \"ovnkube-node-bcplr\" (UID: \"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765\") " pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:10.886061 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.886034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvmft\" (UniqueName: \"kubernetes.io/projected/6d4442d8-719c-4d16-8de1-b4ca7c709645-kube-api-access-wvmft\") pod \"tuned-9k9rp\" (UID: \"6d4442d8-719c-4d16-8de1-b4ca7c709645\") " pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:10.905283 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.905251 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:18:10.969514 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xck4w\" (UniqueName: \"kubernetes.io/projected/c8ee73e7-c318-4fd0-a148-eef6ac668052-kube-api-access-xck4w\") pod \"node-resolver-kndsd\" (UID: \"c8ee73e7-c318-4fd0-a148-eef6ac668052\") " pod="openshift-dns/node-resolver-kndsd" Apr 16 19:18:10.969514 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-cnibin\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" 
Apr 16 19:18:10.969730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f85164ec-72d2-43d8-8a96-11e63cc91aeb-cni-binary-copy\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.969730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-run-multus-certs\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.969730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969593 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-etc-kubernetes\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.969730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnpk8\" (UniqueName: \"kubernetes.io/projected/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-kube-api-access-gnpk8\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.969730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.969730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-device-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.969730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969729 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-run-netns\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c75x\" (UniqueName: \"kubernetes.io/projected/f85164ec-72d2-43d8-8a96-11e63cc91aeb-kube-api-access-9c75x\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-run-k8s-cni-cncf-io\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969824 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-var-lib-cni-multus\") pod \"multus-bl8m2\" (UID: 
\"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969853 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-conf-dir\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969875 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-run-multus-certs\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-run-netns\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969952 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c8ee73e7-c318-4fd0-a148-eef6ac668052-hosts-file\") pod \"node-resolver-kndsd\" (UID: \"c8ee73e7-c318-4fd0-a148-eef6ac668052\") " pod="openshift-dns/node-resolver-kndsd" Apr 16 19:18:10.970076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-device-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.970076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970033 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-etc-kubernetes\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c8ee73e7-c318-4fd0-a148-eef6ac668052-hosts-file\") pod \"node-resolver-kndsd\" (UID: \"c8ee73e7-c318-4fd0-a148-eef6ac668052\") " pod="openshift-dns/node-resolver-kndsd" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970203 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-os-release\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f85164ec-72d2-43d8-8a96-11e63cc91aeb-cni-binary-copy\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-socket-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-var-lib-cni-multus\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-var-lib-kubelet\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970284 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-conf-dir\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-registration-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: 
\"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970315 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-var-lib-kubelet\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.969823 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-cnibin\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970341 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-registration-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970345 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-cni-dir\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970341 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-os-release\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " 
pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-run-k8s-cni-cncf-io\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-socket-dir-parent\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-hostroot\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.970547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-cni-dir\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970441 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-socket-dir\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 
16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970455 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-var-lib-cni-bin\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-socket-dir-parent\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970474 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-hostroot\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-daemon-config\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-os-release\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971398 
ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970508 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-host-var-lib-cni-bin\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-cni-binary-copy\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-etc-selinux\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970586 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-os-release\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970621 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-system-cni-dir\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " 
pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl6mk\" (UniqueName: \"kubernetes.io/projected/2c0e3e53-533e-40a3-85d0-53f899612b42-kube-api-access-xl6mk\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c8ee73e7-c318-4fd0-a148-eef6ac668052-tmp-dir\") pod \"node-resolver-kndsd\" (UID: \"c8ee73e7-c318-4fd0-a148-eef6ac668052\") " pod="openshift-dns/node-resolver-kndsd" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970716 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-cnibin\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970768 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971398 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-sys-fs\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970825 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-system-cni-dir\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.970851 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971026 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f85164ec-72d2-43d8-8a96-11e63cc91aeb-multus-daemon-config\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:18:10.971049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-cni-binary-copy\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971088 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-cnibin\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f85164ec-72d2-43d8-8a96-11e63cc91aeb-system-cni-dir\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971199 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-etc-selinux\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 
19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971447 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2c0e3e53-533e-40a3-85d0-53f899612b42-sys-fs\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971497 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-system-cni-dir\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971521 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c8ee73e7-c318-4fd0-a148-eef6ac668052-tmp-dir\") pod \"node-resolver-kndsd\" (UID: \"c8ee73e7-c318-4fd0-a148-eef6ac668052\") " pod="openshift-dns/node-resolver-kndsd" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971538 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.971991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.971628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: 
\"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.978011 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.977983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c75x\" (UniqueName: \"kubernetes.io/projected/f85164ec-72d2-43d8-8a96-11e63cc91aeb-kube-api-access-9c75x\") pod \"multus-bl8m2\" (UID: \"f85164ec-72d2-43d8-8a96-11e63cc91aeb\") " pod="openshift-multus/multus-bl8m2" Apr 16 19:18:10.978227 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.978208 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnpk8\" (UniqueName: \"kubernetes.io/projected/ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e-kube-api-access-gnpk8\") pod \"multus-additional-cni-plugins-sqvwz\" (UID: \"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e\") " pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:10.978227 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.978218 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xck4w\" (UniqueName: \"kubernetes.io/projected/c8ee73e7-c318-4fd0-a148-eef6ac668052-kube-api-access-xck4w\") pod \"node-resolver-kndsd\" (UID: \"c8ee73e7-c318-4fd0-a148-eef6ac668052\") " pod="openshift-dns/node-resolver-kndsd" Apr 16 19:18:10.979318 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:10.979295 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl6mk\" (UniqueName: \"kubernetes.io/projected/2c0e3e53-533e-40a3-85d0-53f899612b42-kube-api-access-xl6mk\") pod \"aws-ebs-csi-driver-node-jd9bc\" (UID: \"2c0e3e53-533e-40a3-85d0-53f899612b42\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:11.047752 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.047662 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" Apr 16 19:18:11.060815 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.060772 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s2zm4" Apr 16 19:18:11.069710 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.069676 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:11.074614 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.074583 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h9xsl" Apr 16 19:18:11.081568 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.081514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" Apr 16 19:18:11.087525 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.087498 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v9w9k" Apr 16 19:18:11.095279 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.095249 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kndsd" Apr 16 19:18:11.101523 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.101174 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bl8m2" Apr 16 19:18:11.106100 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.106073 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" Apr 16 19:18:11.373740 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.373642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:11.373896 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:11.373829 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:11.373959 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:11.373909 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs podName:81fa50c0-8c06-4a6c-9d00-a1ed89b88844 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:12.373888769 +0000 UTC m=+4.111792034 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs") pod "network-metrics-daemon-xhnjz" (UID: "81fa50c0-8c06-4a6c-9d00-a1ed89b88844") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:11.474463 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.474420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:11.474653 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:11.474629 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:11.474701 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:11.474660 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:11.474701 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:11.474676 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8fbx4 for pod openshift-network-diagnostics/network-check-target-m5l2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:11.474791 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:11.474758 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4 podName:82ff5922-30ac-4de1-81e9-3d15bce731aa nodeName:}" failed. 
No retries permitted until 2026-04-16 19:18:12.474723856 +0000 UTC m=+4.212627104 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-8fbx4" (UniqueName: "kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4") pod "network-check-target-m5l2p" (UID: "82ff5922-30ac-4de1-81e9-3d15bce731aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:11.602738 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:11.602702 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccb4c0a2_5a79_41a1_b87b_38f1a9443c9e.slice/crio-e1d40c0a4ae6584591ebdf431bc8770073690e92fdc906ae9e5059acbc639a3f WatchSource:0}: Error finding container e1d40c0a4ae6584591ebdf431bc8770073690e92fdc906ae9e5059acbc639a3f: Status 404 returned error can't find the container with id e1d40c0a4ae6584591ebdf431bc8770073690e92fdc906ae9e5059acbc639a3f Apr 16 19:18:11.603811 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:11.603786 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d4442d8_719c_4d16_8de1_b4ca7c709645.slice/crio-30f292946e8de082622954991383068574d172b915a0b2a1b50db2de32020930 WatchSource:0}: Error finding container 30f292946e8de082622954991383068574d172b915a0b2a1b50db2de32020930: Status 404 returned error can't find the container with id 30f292946e8de082622954991383068574d172b915a0b2a1b50db2de32020930 Apr 16 19:18:11.604715 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:11.604690 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85164ec_72d2_43d8_8a96_11e63cc91aeb.slice/crio-819fd514a788ce66c0d7467b95a1e1ebd4de04c4ba64b7d5343213d6bf65482f WatchSource:0}: Error finding container 
819fd514a788ce66c0d7467b95a1e1ebd4de04c4ba64b7d5343213d6bf65482f: Status 404 returned error can't find the container with id 819fd514a788ce66c0d7467b95a1e1ebd4de04c4ba64b7d5343213d6bf65482f Apr 16 19:18:11.609608 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:11.609545 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd90da04b_9bd0_4142_9e92_b6e47a4f708c.slice/crio-38e2be3ef943347494e6205963fc2a1ccfdadd057697276a5839cf6cc18d4920 WatchSource:0}: Error finding container 38e2be3ef943347494e6205963fc2a1ccfdadd057697276a5839cf6cc18d4920: Status 404 returned error can't find the container with id 38e2be3ef943347494e6205963fc2a1ccfdadd057697276a5839cf6cc18d4920 Apr 16 19:18:11.609890 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:11.609872 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c90d56_3b40_4f40_bc97_12d990afb385.slice/crio-66d953d609aa0040a0a8835f6876067da841e1e6cc7617c80209de49805efcb4 WatchSource:0}: Error finding container 66d953d609aa0040a0a8835f6876067da841e1e6cc7617c80209de49805efcb4: Status 404 returned error can't find the container with id 66d953d609aa0040a0a8835f6876067da841e1e6cc7617c80209de49805efcb4 Apr 16 19:18:11.611528 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:11.611421 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8ee73e7_c318_4fd0_a148_eef6ac668052.slice/crio-096ccc008ae6f7351849bf0e467f8dfb2727eb8ce58d8847180c97c5b162baba WatchSource:0}: Error finding container 096ccc008ae6f7351849bf0e467f8dfb2727eb8ce58d8847180c97c5b162baba: Status 404 returned error can't find the container with id 096ccc008ae6f7351849bf0e467f8dfb2727eb8ce58d8847180c97c5b162baba Apr 16 19:18:11.611786 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:11.611747 2578 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b22e5a6_5ee7_48e8_b3d5_b1b77686c765.slice/crio-df04e81da3871bca733d5e4f56884c06f5851981e50e441950d6fab5729dfbab WatchSource:0}: Error finding container df04e81da3871bca733d5e4f56884c06f5851981e50e441950d6fab5729dfbab: Status 404 returned error can't find the container with id df04e81da3871bca733d5e4f56884c06f5851981e50e441950d6fab5729dfbab Apr 16 19:18:11.612813 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:11.612612 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc01ac44c_e356_4ba6_9aa3_005d0558378f.slice/crio-9e612b2778c279ee131cfe542d6473d17b9c7614830bd00fb6b7bf3e376403db WatchSource:0}: Error finding container 9e612b2778c279ee131cfe542d6473d17b9c7614830bd00fb6b7bf3e376403db: Status 404 returned error can't find the container with id 9e612b2778c279ee131cfe542d6473d17b9c7614830bd00fb6b7bf3e376403db Apr 16 19:18:11.786178 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.786134 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:13:09 +0000 UTC" deadline="2027-12-04 19:14:56.769738824 +0000 UTC" Apr 16 19:18:11.786178 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.786173 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14327h56m44.983569609s" Apr 16 19:18:11.895244 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.895108 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" event={"ID":"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e","Type":"ContainerStarted","Data":"e1d40c0a4ae6584591ebdf431bc8770073690e92fdc906ae9e5059acbc639a3f"} Apr 16 19:18:11.896714 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.896677 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal" event={"ID":"01a926c8ca0a1b00ab2f53afa74bfa04","Type":"ContainerStarted","Data":"a46e2a122c8acd0de935edec721efed706c66093df17e217e28e4bf2e6783d40"}
Apr 16 19:18:11.897773 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.897738 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerStarted","Data":"df04e81da3871bca733d5e4f56884c06f5851981e50e441950d6fab5729dfbab"}
Apr 16 19:18:11.898815 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.898783 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bl8m2" event={"ID":"f85164ec-72d2-43d8-8a96-11e63cc91aeb","Type":"ContainerStarted","Data":"819fd514a788ce66c0d7467b95a1e1ebd4de04c4ba64b7d5343213d6bf65482f"}
Apr 16 19:18:11.899716 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.899677 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" event={"ID":"6d4442d8-719c-4d16-8de1-b4ca7c709645","Type":"ContainerStarted","Data":"30f292946e8de082622954991383068574d172b915a0b2a1b50db2de32020930"}
Apr 16 19:18:11.902002 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.901969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s2zm4" event={"ID":"99c90d56-3b40-4f40-bc97-12d990afb385","Type":"ContainerStarted","Data":"66d953d609aa0040a0a8835f6876067da841e1e6cc7617c80209de49805efcb4"}
Apr 16 19:18:11.903017 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.902993 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h9xsl" event={"ID":"c01ac44c-e356-4ba6-9aa3-005d0558378f","Type":"ContainerStarted","Data":"9e612b2778c279ee131cfe542d6473d17b9c7614830bd00fb6b7bf3e376403db"}
Apr 16 19:18:11.904124 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.904102 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kndsd" event={"ID":"c8ee73e7-c318-4fd0-a148-eef6ac668052","Type":"ContainerStarted","Data":"096ccc008ae6f7351849bf0e467f8dfb2727eb8ce58d8847180c97c5b162baba"}
Apr 16 19:18:11.905455 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.905431 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v9w9k" event={"ID":"d90da04b-9bd0-4142-9e92-b6e47a4f708c","Type":"ContainerStarted","Data":"38e2be3ef943347494e6205963fc2a1ccfdadd057697276a5839cf6cc18d4920"}
Apr 16 19:18:11.907022 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.907001 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" event={"ID":"2c0e3e53-533e-40a3-85d0-53f899612b42","Type":"ContainerStarted","Data":"84f4e8fa0b9697ddf2cc6003c2d13e2586f8ca4e07df9e97396e0ecb3b642ddf"}
Apr 16 19:18:11.912000 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:11.911955 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-163.ec2.internal" podStartSLOduration=1.911942676 podStartE2EDuration="1.911942676s" podCreationTimestamp="2026-04-16 19:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:18:11.911832476 +0000 UTC m=+3.649735745" watchObservedRunningTime="2026-04-16 19:18:11.911942676 +0000 UTC m=+3.649845944"
Apr 16 19:18:12.382157 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:12.382069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:12.382330 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:12.382265 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:18:12.382430 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:12.382399 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs podName:81fa50c0-8c06-4a6c-9d00-a1ed89b88844 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:14.382325627 +0000 UTC m=+6.120228878 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs") pod "network-metrics-daemon-xhnjz" (UID: "81fa50c0-8c06-4a6c-9d00-a1ed89b88844") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:18:12.483928 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:12.483289 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:12.483928 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:12.483484 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:18:12.483928 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:12.483505 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:18:12.483928 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:12.483518 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8fbx4 for pod openshift-network-diagnostics/network-check-target-m5l2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:18:12.483928 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:12.483581 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4 podName:82ff5922-30ac-4de1-81e9-3d15bce731aa nodeName:}" failed. No retries permitted until 2026-04-16 19:18:14.483560296 +0000 UTC m=+6.221463560 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-8fbx4" (UniqueName: "kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4") pod "network-check-target-m5l2p" (UID: "82ff5922-30ac-4de1-81e9-3d15bce731aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:18:12.882600 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:12.882564 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:12.883057 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:12.882720 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844"
Apr 16 19:18:12.883519 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:12.883493 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:12.883627 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:12.883609 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa"
Apr 16 19:18:12.930884 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:12.929655 2578 generic.go:358] "Generic (PLEG): container finished" podID="9b83e43084b16f544f9da37f924cc2fa" containerID="7c3cd6e241373ba5f31e583c35dd80e678f179834be2a2382e1d6351c86bf870" exitCode=0
Apr 16 19:18:12.930884 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:12.930664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" event={"ID":"9b83e43084b16f544f9da37f924cc2fa","Type":"ContainerDied","Data":"7c3cd6e241373ba5f31e583c35dd80e678f179834be2a2382e1d6351c86bf870"}
Apr 16 19:18:13.286965 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.286933 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-258dd"]
Apr 16 19:18:13.292135 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.291785 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.292135 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:13.291881 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e"
Apr 16 19:18:13.389655 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.389619 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.389831 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.389706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ae60b17b-715c-479d-a9cc-496e69796c4e-kubelet-config\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.389831 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.389736 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ae60b17b-715c-479d-a9cc-496e69796c4e-dbus\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.490938 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.490889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.491132 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.490966 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ae60b17b-715c-479d-a9cc-496e69796c4e-kubelet-config\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.491132 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.490993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ae60b17b-715c-479d-a9cc-496e69796c4e-dbus\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.491132 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:13.491037 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:13.491132 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:13.491104 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret podName:ae60b17b-715c-479d-a9cc-496e69796c4e nodeName:}" failed. No retries permitted until 2026-04-16 19:18:13.991084746 +0000 UTC m=+5.728987999 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret") pod "global-pull-secret-syncer-258dd" (UID: "ae60b17b-715c-479d-a9cc-496e69796c4e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:13.491342 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.491178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ae60b17b-715c-479d-a9cc-496e69796c4e-dbus\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.491342 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.491222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ae60b17b-715c-479d-a9cc-496e69796c4e-kubelet-config\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.940117 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.940056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" event={"ID":"9b83e43084b16f544f9da37f924cc2fa","Type":"ContainerStarted","Data":"315366cb168006183deeb6a593cd0beab91c8f4c908c3b9c83ee483511221b68"}
Apr 16 19:18:13.994999 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:13.994956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:13.995201 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:13.995181 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:13.995281 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:13.995258 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret podName:ae60b17b-715c-479d-a9cc-496e69796c4e nodeName:}" failed. No retries permitted until 2026-04-16 19:18:14.995237799 +0000 UTC m=+6.733141054 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret") pod "global-pull-secret-syncer-258dd" (UID: "ae60b17b-715c-479d-a9cc-496e69796c4e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:14.398574 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:14.398482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:14.398738 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:14.398639 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:18:14.398791 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:14.398780 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs podName:81fa50c0-8c06-4a6c-9d00-a1ed89b88844 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:18.398757653 +0000 UTC m=+10.136660909 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs") pod "network-metrics-daemon-xhnjz" (UID: "81fa50c0-8c06-4a6c-9d00-a1ed89b88844") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:18:14.499743 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:14.499695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:14.499987 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:14.499965 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:18:14.500059 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:14.499993 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:18:14.500059 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:14.500006 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8fbx4 for pod openshift-network-diagnostics/network-check-target-m5l2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:18:14.500153 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:14.500070 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4 podName:82ff5922-30ac-4de1-81e9-3d15bce731aa nodeName:}" failed. No retries permitted until 2026-04-16 19:18:18.50004925 +0000 UTC m=+10.237952497 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-8fbx4" (UniqueName: "kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4") pod "network-check-target-m5l2p" (UID: "82ff5922-30ac-4de1-81e9-3d15bce731aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:18:14.884426 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:14.884326 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:14.884594 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:14.884476 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e"
Apr 16 19:18:14.884901 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:14.884880 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:14.885005 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:14.884986 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844"
Apr 16 19:18:14.885076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:14.885064 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:14.885152 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:14.885134 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa"
Apr 16 19:18:15.003215 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:15.003178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:15.003775 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:15.003351 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:15.003775 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:15.003446 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret podName:ae60b17b-715c-479d-a9cc-496e69796c4e nodeName:}" failed. No retries permitted until 2026-04-16 19:18:17.003424862 +0000 UTC m=+8.741328141 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret") pod "global-pull-secret-syncer-258dd" (UID: "ae60b17b-715c-479d-a9cc-496e69796c4e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:16.882905 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:16.882858 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:16.883397 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:16.882986 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:16.883397 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:16.883000 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e"
Apr 16 19:18:16.883397 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:16.883092 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa"
Apr 16 19:18:16.883397 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:16.883137 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:16.883397 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:16.883196 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844"
Apr 16 19:18:17.023657 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:17.023099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:17.023657 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:17.023247 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:17.023657 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:17.023315 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret podName:ae60b17b-715c-479d-a9cc-496e69796c4e nodeName:}" failed. No retries permitted until 2026-04-16 19:18:21.023295746 +0000 UTC m=+12.761199015 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret") pod "global-pull-secret-syncer-258dd" (UID: "ae60b17b-715c-479d-a9cc-496e69796c4e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:18.434186 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:18.434147 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:18.435095 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:18.434328 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:18:18.435095 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:18.434391 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs podName:81fa50c0-8c06-4a6c-9d00-a1ed89b88844 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:26.434373872 +0000 UTC m=+18.172277142 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs") pod "network-metrics-daemon-xhnjz" (UID: "81fa50c0-8c06-4a6c-9d00-a1ed89b88844") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:18:18.535172 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:18.535122 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:18.535371 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:18.535288 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:18:18.535371 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:18.535307 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:18:18.535371 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:18.535321 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8fbx4 for pod openshift-network-diagnostics/network-check-target-m5l2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:18:18.535565 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:18.535382 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4 podName:82ff5922-30ac-4de1-81e9-3d15bce731aa nodeName:}" failed. No retries permitted until 2026-04-16 19:18:26.535362711 +0000 UTC m=+18.273265961 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-8fbx4" (UniqueName: "kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4") pod "network-check-target-m5l2p" (UID: "82ff5922-30ac-4de1-81e9-3d15bce731aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:18:18.883174 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:18.883087 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:18.883334 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:18.883200 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:18.883334 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:18.883234 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:18.883334 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:18.883245 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844"
Apr 16 19:18:18.883334 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:18.883312 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e"
Apr 16 19:18:18.883524 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:18.883384 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa"
Apr 16 19:18:20.882179 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:20.882144 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:20.882819 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:20.882285 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844"
Apr 16 19:18:20.882819 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:20.882697 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:20.882819 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:20.882782 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa"
Apr 16 19:18:20.883001 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:20.882838 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:20.883001 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:20.882905 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e"
Apr 16 19:18:21.055854 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:21.055812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:21.056032 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:21.055992 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:21.056112 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:21.056068 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret podName:ae60b17b-715c-479d-a9cc-496e69796c4e nodeName:}" failed. No retries permitted until 2026-04-16 19:18:29.056048869 +0000 UTC m=+20.793952134 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret") pod "global-pull-secret-syncer-258dd" (UID: "ae60b17b-715c-479d-a9cc-496e69796c4e") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:18:22.882933 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:22.882893 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:22.883366 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:22.883020 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:22.883366 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:22.883045 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e"
Apr 16 19:18:22.883366 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:22.883132 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa"
Apr 16 19:18:22.883366 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:22.883180 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:22.883366 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:22.883237 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844"
Apr 16 19:18:24.882440 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:24.882389 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:24.882874 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:24.882389 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:24.882874 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:24.882538 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e"
Apr 16 19:18:24.882874 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:24.882614 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa"
Apr 16 19:18:24.882874 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:24.882389 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:24.882874 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:24.882726 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844"
Apr 16 19:18:26.496549 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:26.496509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:26.497008 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:26.496634 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:18:26.497008 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:26.496709 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs podName:81fa50c0-8c06-4a6c-9d00-a1ed89b88844 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:42.496687736 +0000 UTC m=+34.234590999 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs") pod "network-metrics-daemon-xhnjz" (UID: "81fa50c0-8c06-4a6c-9d00-a1ed89b88844") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:26.597279 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:26.597233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:26.597498 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:26.597397 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:26.597498 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:26.597438 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:26.597498 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:26.597453 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8fbx4 for pod openshift-network-diagnostics/network-check-target-m5l2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:26.597637 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:26.597520 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4 podName:82ff5922-30ac-4de1-81e9-3d15bce731aa nodeName:}" failed. 
No retries permitted until 2026-04-16 19:18:42.597501804 +0000 UTC m=+34.335405057 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-8fbx4" (UniqueName: "kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4") pod "network-check-target-m5l2p" (UID: "82ff5922-30ac-4de1-81e9-3d15bce731aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:26.885258 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:26.885184 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:26.885258 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:26.885218 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:26.885476 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:26.885294 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e" Apr 16 19:18:26.885476 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:26.885341 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:26.885476 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:26.885429 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:26.885592 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:26.885513 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:28.884888 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:28.884856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:28.884888 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:28.884868 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:28.885374 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:28.884857 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:28.885374 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:28.884951 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e" Apr 16 19:18:28.885374 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:28.885020 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:28.885374 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:28.885098 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:29.117523 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.115678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:29.117523 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:29.115859 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:29.117523 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:29.115935 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret podName:ae60b17b-715c-479d-a9cc-496e69796c4e nodeName:}" failed. 
No retries permitted until 2026-04-16 19:18:45.115914981 +0000 UTC m=+36.853818229 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret") pod "global-pull-secret-syncer-258dd" (UID: "ae60b17b-715c-479d-a9cc-496e69796c4e") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:29.968157 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.967927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h9xsl" event={"ID":"c01ac44c-e356-4ba6-9aa3-005d0558378f","Type":"ContainerStarted","Data":"b3255f21c1b540e235895846d3a1664d0934e1739cc6626e378ee74e13f15257"} Apr 16 19:18:29.969433 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.969389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kndsd" event={"ID":"c8ee73e7-c318-4fd0-a148-eef6ac668052","Type":"ContainerStarted","Data":"dd4282df41441761fe9812c09a0228ef2cd3fea44bee70587b991c25dd7df0b3"} Apr 16 19:18:29.971584 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.971316 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v9w9k" event={"ID":"d90da04b-9bd0-4142-9e92-b6e47a4f708c","Type":"ContainerStarted","Data":"67bfd0aa76a41ba489ad1c9dc5d7fc96fad18b5e69fd9b0b909bd1b442bcfa80"} Apr 16 19:18:29.974224 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.974194 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" event={"ID":"2c0e3e53-533e-40a3-85d0-53f899612b42","Type":"ContainerStarted","Data":"acdf71a75946517cb73a853b3a8badb39a327bc84e80eedda03bb635fd006472"} Apr 16 19:18:29.975497 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.975473 2578 generic.go:358] "Generic (PLEG): container finished" podID="ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e" 
containerID="0795cdd7bc04e71c7f2c3a1287dd7434f1d1df8f517d278d6d4f82bbae4f3573" exitCode=0 Apr 16 19:18:29.975598 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.975555 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" event={"ID":"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e","Type":"ContainerDied","Data":"0795cdd7bc04e71c7f2c3a1287dd7434f1d1df8f517d278d6d4f82bbae4f3573"} Apr 16 19:18:29.979054 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.979033 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:18:29.979400 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.979378 2578 generic.go:358] "Generic (PLEG): container finished" podID="5b22e5a6-5ee7-48e8-b3d5-b1b77686c765" containerID="ebdac06903acccb902916d6b79d5796215f5b3466d6021c95243e5fbb2e5c1ae" exitCode=1 Apr 16 19:18:29.979508 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.979439 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerStarted","Data":"4853866deb909cac0da7b79dd8b62fc0f06fc40ccc46a7ff1527b7664ed27556"} Apr 16 19:18:29.979508 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.979469 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerStarted","Data":"82d2c22aab737b47ed367071c18547e9ec64d54298ac68c70baf83e36c97fa42"} Apr 16 19:18:29.979508 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.979479 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerStarted","Data":"a4e3f120c0c4594b236426454c2a14f89786d0d1d8ee4f27b655f9db32c16fa5"} Apr 16 19:18:29.979508 
ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.979487 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerDied","Data":"ebdac06903acccb902916d6b79d5796215f5b3466d6021c95243e5fbb2e5c1ae"} Apr 16 19:18:29.979508 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.979499 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerStarted","Data":"7ec51108a42527ec121681d9bd8635386b549503982b2c94b28b1b6c25f467b5"} Apr 16 19:18:29.980824 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.980793 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bl8m2" event={"ID":"f85164ec-72d2-43d8-8a96-11e63cc91aeb","Type":"ContainerStarted","Data":"2d5367327257c002970ede96153f9b7fd2935b0ffdc65ed417f4fcfd6455edf2"} Apr 16 19:18:29.982098 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.982074 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" event={"ID":"6d4442d8-719c-4d16-8de1-b4ca7c709645","Type":"ContainerStarted","Data":"2aabb36604f8609c18258a935e8fead2063f234f56bd11b6a540705a2d48af03"} Apr 16 19:18:29.984816 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:29.984771 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-163.ec2.internal" podStartSLOduration=19.984760105 podStartE2EDuration="19.984760105s" podCreationTimestamp="2026-04-16 19:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:18:13.958916186 +0000 UTC m=+5.696819455" watchObservedRunningTime="2026-04-16 19:18:29.984760105 +0000 UTC m=+21.722663373" Apr 16 19:18:29.985017 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:18:29.984978 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h9xsl" podStartSLOduration=4.654777599 podStartE2EDuration="21.984971127s" podCreationTimestamp="2026-04-16 19:18:08 +0000 UTC" firstStartedPulling="2026-04-16 19:18:11.616168496 +0000 UTC m=+3.354071745" lastFinishedPulling="2026-04-16 19:18:28.946362012 +0000 UTC m=+20.684265273" observedRunningTime="2026-04-16 19:18:29.984369312 +0000 UTC m=+21.722272580" watchObservedRunningTime="2026-04-16 19:18:29.984971127 +0000 UTC m=+21.722874416" Apr 16 19:18:30.026165 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.026106 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9k9rp" podStartSLOduration=4.606340819 podStartE2EDuration="22.026087568s" podCreationTimestamp="2026-04-16 19:18:08 +0000 UTC" firstStartedPulling="2026-04-16 19:18:11.606725959 +0000 UTC m=+3.344629207" lastFinishedPulling="2026-04-16 19:18:29.026472699 +0000 UTC m=+20.764375956" observedRunningTime="2026-04-16 19:18:30.025769123 +0000 UTC m=+21.763672391" watchObservedRunningTime="2026-04-16 19:18:30.026087568 +0000 UTC m=+21.763990837" Apr 16 19:18:30.040353 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.040302 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-v9w9k" podStartSLOduration=4.62943004 podStartE2EDuration="22.040287775s" podCreationTimestamp="2026-04-16 19:18:08 +0000 UTC" firstStartedPulling="2026-04-16 19:18:11.614072161 +0000 UTC m=+3.351975423" lastFinishedPulling="2026-04-16 19:18:29.024929895 +0000 UTC m=+20.762833158" observedRunningTime="2026-04-16 19:18:30.040190102 +0000 UTC m=+21.778093381" watchObservedRunningTime="2026-04-16 19:18:30.040287775 +0000 UTC m=+21.778191043" Apr 16 19:18:30.056466 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.056394 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kndsd" podStartSLOduration=3.724506755 podStartE2EDuration="21.056378787s" podCreationTimestamp="2026-04-16 19:18:09 +0000 UTC" firstStartedPulling="2026-04-16 19:18:11.614492283 +0000 UTC m=+3.352395536" lastFinishedPulling="2026-04-16 19:18:28.946364313 +0000 UTC m=+20.684267568" observedRunningTime="2026-04-16 19:18:30.055768324 +0000 UTC m=+21.793671597" watchObservedRunningTime="2026-04-16 19:18:30.056378787 +0000 UTC m=+21.794282055" Apr 16 19:18:30.078255 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.078180 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bl8m2" podStartSLOduration=3.624056244 podStartE2EDuration="21.07816413s" podCreationTimestamp="2026-04-16 19:18:09 +0000 UTC" firstStartedPulling="2026-04-16 19:18:11.607113944 +0000 UTC m=+3.345017190" lastFinishedPulling="2026-04-16 19:18:29.061221826 +0000 UTC m=+20.799125076" observedRunningTime="2026-04-16 19:18:30.077705814 +0000 UTC m=+21.815609086" watchObservedRunningTime="2026-04-16 19:18:30.07816413 +0000 UTC m=+21.816067398" Apr 16 19:18:30.545846 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.545594 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:18:30.826446 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.826309 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:18:30.545618392Z","UUID":"89416062-852f-41e9-bef5-71d4ed90a810","Handler":null,"Name":"","Endpoint":""} Apr 16 19:18:30.828516 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.828492 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:18:30.828516 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.828522 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:18:30.882176 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.882135 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:30.882364 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.882258 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:30.882364 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:30.882265 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e" Apr 16 19:18:30.882498 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:30.882387 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:30.882498 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.882447 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:30.882585 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:30.882512 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:30.985695 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.985654 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s2zm4" event={"ID":"99c90d56-3b40-4f40-bc97-12d990afb385","Type":"ContainerStarted","Data":"c305082ef2e4408592b9034af4bb0cb48d3895d1dbb99502500c86d9779a4f5b"} Apr 16 19:18:30.987431 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.987387 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" event={"ID":"2c0e3e53-533e-40a3-85d0-53f899612b42","Type":"ContainerStarted","Data":"232fcefe9d6945e1b48acc89eeb83a3e30365921a03e7425cde8e8cd938aa919"} Apr 16 19:18:30.990066 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.990044 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:18:30.991004 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:30.990485 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerStarted","Data":"3318e813a70c219383cc49d322a3c7466210fa165d4bdda834e8cc234038e24c"} Apr 16 19:18:31.002218 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:31.002168 2578 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-network-operator/iptables-alerter-s2zm4" podStartSLOduration=5.590150354 podStartE2EDuration="23.002150455s" podCreationTimestamp="2026-04-16 19:18:08 +0000 UTC" firstStartedPulling="2026-04-16 19:18:11.614269177 +0000 UTC m=+3.352172425" lastFinishedPulling="2026-04-16 19:18:29.026269264 +0000 UTC m=+20.764172526" observedRunningTime="2026-04-16 19:18:31.001658154 +0000 UTC m=+22.739561423" watchObservedRunningTime="2026-04-16 19:18:31.002150455 +0000 UTC m=+22.740053725" Apr 16 19:18:31.995192 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:31.995150 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" event={"ID":"2c0e3e53-533e-40a3-85d0-53f899612b42","Type":"ContainerStarted","Data":"4220eaf133452c6e9767bac2e85a34040a675a7e51dc5730e889aef88a873b70"} Apr 16 19:18:32.014992 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:32.014933 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jd9bc" podStartSLOduration=2.994626688 podStartE2EDuration="23.01491751s" podCreationTimestamp="2026-04-16 19:18:09 +0000 UTC" firstStartedPulling="2026-04-16 19:18:11.608399707 +0000 UTC m=+3.346302953" lastFinishedPulling="2026-04-16 19:18:31.628690528 +0000 UTC m=+23.366593775" observedRunningTime="2026-04-16 19:18:32.014572579 +0000 UTC m=+23.752475859" watchObservedRunningTime="2026-04-16 19:18:32.01491751 +0000 UTC m=+23.752820778" Apr 16 19:18:32.841842 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:32.841638 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h9xsl" Apr 16 19:18:32.842328 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:32.842306 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h9xsl" Apr 16 19:18:32.883435 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:18:32.882388 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:32.883435 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:32.882387 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:32.883435 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:32.882554 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:32.883435 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:32.882645 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e" Apr 16 19:18:32.883435 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:32.882388 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:32.883435 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:32.882736 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:32.999900 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:32.999872 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:18:33.000312 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:33.000178 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerStarted","Data":"6c71dc53fc931dc0ab1bbc2f7280fb4c9649404c7cd8bc7249d280f7cf12c8ea"} Apr 16 19:18:33.000646 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:33.000629 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h9xsl" Apr 16 19:18:33.001049 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:33.001025 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h9xsl" Apr 16 19:18:34.882852 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:34.882765 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:34.883695 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:34.882891 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:34.883695 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:34.882766 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:34.883695 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:34.882973 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e" Apr 16 19:18:34.883695 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:34.882765 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:34.883695 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:34.883030 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:35.006690 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:35.006518 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:18:35.007038 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:35.007001 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerStarted","Data":"6c09095bf9b00022b5ce574171264298c1ac41e5a2b6c99feada6b6e28db90f2"} Apr 16 19:18:35.010305 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:35.008791 2578 scope.go:117] "RemoveContainer" containerID="ebdac06903acccb902916d6b79d5796215f5b3466d6021c95243e5fbb2e5c1ae" Apr 16 19:18:36.013550 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.013521 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:18:36.014164 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.014131 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" event={"ID":"5b22e5a6-5ee7-48e8-b3d5-b1b77686c765","Type":"ContainerStarted","Data":"9c2eed79d2cb39623ee6dda2dd4e341c0dfc68fb380909a9ffe18c06a4d77ef5"} Apr 16 19:18:36.014516 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.014496 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:36.014630 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.014607 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:36.014694 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.014687 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:36.031976 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.031949 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:36.032145 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.032105 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:18:36.045255 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.045206 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" podStartSLOduration=10.31356755 podStartE2EDuration="28.045190934s" podCreationTimestamp="2026-04-16 19:18:08 +0000 UTC" firstStartedPulling="2026-04-16 19:18:11.613957988 +0000 UTC m=+3.351861240" lastFinishedPulling="2026-04-16 19:18:29.345581378 +0000 UTC m=+21.083484624" observedRunningTime="2026-04-16 19:18:36.044746486 +0000 UTC m=+27.782649767" watchObservedRunningTime="2026-04-16 19:18:36.045190934 +0000 UTC m=+27.783094237" Apr 16 19:18:36.567183 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.567147 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m5l2p"] Apr 16 19:18:36.567462 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.567298 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:36.567462 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:36.567436 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:36.568863 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.568735 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-258dd"] Apr 16 19:18:36.568863 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.568854 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:36.569046 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:36.568940 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e" Apr 16 19:18:36.569433 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.569383 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xhnjz"] Apr 16 19:18:36.569529 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:36.569520 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:36.569660 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:36.569628 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:37.882901 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:37.882867 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:37.883301 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:37.882867 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:37.883301 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:37.882980 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:37.883301 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:37.883045 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:38.019503 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:38.019464 2578 generic.go:358] "Generic (PLEG): container finished" podID="ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e" containerID="d4dadfe1822d84d5885051e45bd3b69cf08bb75903f3292df9b734c7a9519b37" exitCode=0 Apr 16 19:18:38.019674 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:38.019547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" event={"ID":"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e","Type":"ContainerDied","Data":"d4dadfe1822d84d5885051e45bd3b69cf08bb75903f3292df9b734c7a9519b37"} Apr 16 19:18:38.883341 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:38.883091 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:38.883818 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:38.883363 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e" Apr 16 19:18:39.881863 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:39.881826 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:39.882028 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:39.881826 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:39.882028 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:39.881965 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:39.882139 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:39.882029 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:40.025068 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:40.025029 2578 generic.go:358] "Generic (PLEG): container finished" podID="ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e" containerID="dec7101b81baea43090854f25437b7f116340b79e003ec00ab1a12387da1fa12" exitCode=0 Apr 16 19:18:40.025567 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:40.025096 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" event={"ID":"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e","Type":"ContainerDied","Data":"dec7101b81baea43090854f25437b7f116340b79e003ec00ab1a12387da1fa12"} Apr 16 19:18:40.882890 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:40.882859 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:40.883055 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:40.882977 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e" Apr 16 19:18:41.882738 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:41.882697 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:41.883228 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:41.882697 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:41.883228 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:41.882815 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-m5l2p" podUID="82ff5922-30ac-4de1-81e9-3d15bce731aa" Apr 16 19:18:41.883228 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:41.882888 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xhnjz" podUID="81fa50c0-8c06-4a6c-9d00-a1ed89b88844" Apr 16 19:18:42.031419 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:42.031378 2578 generic.go:358] "Generic (PLEG): container finished" podID="ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e" containerID="5eeb6d5327e45c67b42d62132d120e16fdcc0d0ceac752a143d7d3eeb2a279ac" exitCode=0 Apr 16 19:18:42.031601 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:42.031468 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" event={"ID":"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e","Type":"ContainerDied","Data":"5eeb6d5327e45c67b42d62132d120e16fdcc0d0ceac752a143d7d3eeb2a279ac"} Apr 16 19:18:42.508082 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:42.508040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz" Apr 16 19:18:42.508270 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:42.508196 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:42.508270 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:42.508265 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs podName:81fa50c0-8c06-4a6c-9d00-a1ed89b88844 nodeName:}" failed. No retries permitted until 2026-04-16 19:19:14.508248819 +0000 UTC m=+66.246152065 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs") pod "network-metrics-daemon-xhnjz" (UID: "81fa50c0-8c06-4a6c-9d00-a1ed89b88844") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:42.609372 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:42.609333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p" Apr 16 19:18:42.609566 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:42.609529 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:42.609566 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:42.609557 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:42.609645 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:42.609572 2578 projected.go:194] Error preparing data for projected volume kube-api-access-8fbx4 for pod openshift-network-diagnostics/network-check-target-m5l2p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:42.609645 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:42.609625 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4 podName:82ff5922-30ac-4de1-81e9-3d15bce731aa nodeName:}" failed. 
No retries permitted until 2026-04-16 19:19:14.609610235 +0000 UTC m=+66.347513481 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-8fbx4" (UniqueName: "kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4") pod "network-check-target-m5l2p" (UID: "82ff5922-30ac-4de1-81e9-3d15bce731aa") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:42.882449 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:42.882396 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd" Apr 16 19:18:42.882608 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:42.882537 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-258dd" podUID="ae60b17b-715c-479d-a9cc-496e69796c4e" Apr 16 19:18:43.132123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.132035 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-163.ec2.internal" event="NodeReady" Apr 16 19:18:43.132671 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.132195 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:18:43.173035 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.173000 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-fd47c7899-27h7v"] Apr 16 19:18:43.195917 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.195873 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7v82m"] Apr 16 19:18:43.196077 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.196044 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.199038 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.198883 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 19:18:43.199038 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.198940 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bjmd7\"" Apr 16 19:18:43.199038 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.198970 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 19:18:43.199460 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.199441 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 19:18:43.206544 ip-10-0-130-163 kubenswrapper[2578]: 
I0416 19:18:43.206522 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 19:18:43.216670 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.216643 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rdj9w"] Apr 16 19:18:43.216797 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.216777 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:43.219371 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.219348 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:18:43.219517 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.219430 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n84tw\"" Apr 16 19:18:43.219676 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.219630 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:18:43.237870 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.237845 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-fd47c7899-27h7v"] Apr 16 19:18:43.237870 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.237871 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7v82m"] Apr 16 19:18:43.238039 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.237879 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rdj9w"] Apr 16 19:18:43.238039 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.237994 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rdj9w" Apr 16 19:18:43.241071 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.241051 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bx8r\"" Apr 16 19:18:43.241324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.241303 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:18:43.241432 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.241312 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:18:43.241432 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.241355 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:18:43.316088 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316048 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.316088 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316090 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-installation-pull-secrets\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.316316 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316179 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-ca-trust-extracted\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.316316 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/368abbc9-410f-479c-9fa1-3676e33eeb51-tmp-dir\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:43.316316 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-certificates\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.316316 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316276 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-757fx\" (UniqueName: \"kubernetes.io/projected/419f622c-e0bd-4976-87ce-7df0a1ed0500-kube-api-access-757fx\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w" Apr 16 19:18:43.316500 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316326 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-bound-sa-token\") pod \"image-registry-fd47c7899-27h7v\" (UID: 
\"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.316500 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcbd\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-kube-api-access-8lcbd\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.316500 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:43.316500 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fznwl\" (UniqueName: \"kubernetes.io/projected/368abbc9-410f-479c-9fa1-3676e33eeb51-kube-api-access-fznwl\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:43.316500 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316487 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w" Apr 16 19:18:43.316672 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316540 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-trusted-ca\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.316672 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/368abbc9-410f-479c-9fa1-3676e33eeb51-config-volume\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:43.316672 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.316589 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-image-registry-private-configuration\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.417936 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.417835 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-certificates\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.417936 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.417879 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-757fx\" (UniqueName: \"kubernetes.io/projected/419f622c-e0bd-4976-87ce-7df0a1ed0500-kube-api-access-757fx\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w" 
Apr 16 19:18:43.418123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.417941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-bound-sa-token\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.418123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.417959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcbd\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-kube-api-access-8lcbd\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.418123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.417983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:43.418123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fznwl\" (UniqueName: \"kubernetes.io/projected/368abbc9-410f-479c-9fa1-3676e33eeb51-kube-api-access-fznwl\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:43.418123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " 
pod="openshift-ingress-canary/ingress-canary-rdj9w" Apr 16 19:18:43.418123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-trusted-ca\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.418123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418105 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/368abbc9-410f-479c-9fa1-3676e33eeb51-config-volume\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:43.418323 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-image-registry-private-configuration\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.418323 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.418323 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418184 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-installation-pull-secrets\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.418323 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-ca-trust-extracted\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:43.418323 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.418245 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:43.418323 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418256 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/368abbc9-410f-479c-9fa1-3676e33eeb51-tmp-dir\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:43.418323 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.418315 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert podName:419f622c-e0bd-4976-87ce-7df0a1ed0500 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:43.918291962 +0000 UTC m=+35.656195224 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert") pod "ingress-canary-rdj9w" (UID: "419f622c-e0bd-4976-87ce-7df0a1ed0500") : secret "canary-serving-cert" not found
Apr 16 19:18:43.418640 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.418249 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:18:43.418640 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.418528 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls podName:368abbc9-410f-479c-9fa1-3676e33eeb51 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:43.918508875 +0000 UTC m=+35.656412135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls") pod "dns-default-7v82m" (UID: "368abbc9-410f-479c-9fa1-3676e33eeb51") : secret "dns-default-metrics-tls" not found
Apr 16 19:18:43.418640 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-certificates\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:43.418754 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/368abbc9-410f-479c-9fa1-3676e33eeb51-tmp-dir\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m"
Apr 16 19:18:43.418891 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.418870 2578
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-ca-trust-extracted\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:43.419220 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.419192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/368abbc9-410f-479c-9fa1-3676e33eeb51-config-volume\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m"
Apr 16 19:18:43.419358 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.419283 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:18:43.419358 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.419297 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd47c7899-27h7v: secret "image-registry-tls" not found
Apr 16 19:18:43.419358 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.419341 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls podName:f99d9c18-6acb-4dcc-a1e9-afadc152c9e2 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:43.919326934 +0000 UTC m=+35.657230184 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls") pod "image-registry-fd47c7899-27h7v" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2") : secret "image-registry-tls" not found
Apr 16 19:18:43.419556 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.419537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-trusted-ca\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:43.423117 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.423010 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-image-registry-private-configuration\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:43.423117 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.423080 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-installation-pull-secrets\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:43.433520 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.433489 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fznwl\" (UniqueName: \"kubernetes.io/projected/368abbc9-410f-479c-9fa1-3676e33eeb51-kube-api-access-fznwl\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m"
Apr 16 19:18:43.433520 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.433506 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-bound-sa-token\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:43.433743 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.433680 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lcbd\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-kube-api-access-8lcbd\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:43.433824 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.433797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-757fx\" (UniqueName: \"kubernetes.io/projected/419f622c-e0bd-4976-87ce-7df0a1ed0500-kube-api-access-757fx\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w"
Apr 16 19:18:43.882465 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.882399 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:18:43.882465 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.882473 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:18:43.885227 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.885197 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:18:43.885377 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.885234 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:18:43.886363 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.886339 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s2294\""
Apr 16 19:18:43.886502 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.886392 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:18:43.886502 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.886394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t5lg8\""
Apr 16 19:18:43.921828 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.921793 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m"
Apr 16 19:18:43.921828 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.921832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w"
Apr 16 19:18:43.922044 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:43.921868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:43.922044 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.921958 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:18:43.922044 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.921958 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:18:43.922044 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.921970 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd47c7899-27h7v: secret "image-registry-tls" not found
Apr 16 19:18:43.922044 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.922031 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls podName:f99d9c18-6acb-4dcc-a1e9-afadc152c9e2 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:44.922016903 +0000 UTC m=+36.659920149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls") pod "image-registry-fd47c7899-27h7v" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2") : secret "image-registry-tls" not found
Apr 16 19:18:43.922044 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.922043 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls podName:368abbc9-410f-479c-9fa1-3676e33eeb51 nodeName:}" failed.
No retries permitted until 2026-04-16 19:18:44.922037113 +0000 UTC m=+36.659940359 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls") pod "dns-default-7v82m" (UID: "368abbc9-410f-479c-9fa1-3676e33eeb51") : secret "dns-default-metrics-tls" not found
Apr 16 19:18:43.922044 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.921957 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:18:43.922259 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:43.922073 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert podName:419f622c-e0bd-4976-87ce-7df0a1ed0500 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:44.922064659 +0000 UTC m=+36.659967905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert") pod "ingress-canary-rdj9w" (UID: "419f622c-e0bd-4976-87ce-7df0a1ed0500") : secret "canary-serving-cert" not found
Apr 16 19:18:44.479980 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.479939 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr"]
Apr 16 19:18:44.485258 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.485227 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr"
Apr 16 19:18:44.489124 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.489051 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 19:18:44.490167 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.490017 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 19:18:44.490167 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.490027 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-mc9v8\""
Apr 16 19:18:44.497342 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.497314 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr"]
Apr 16 19:18:44.627321 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.627274 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqd4\" (UniqueName: \"kubernetes.io/projected/aab238ff-d84e-4ed9-8165-c0f56eacaf68-kube-api-access-9vqd4\") pod \"migrator-74bb7799d9-wvlvr\" (UID: \"aab238ff-d84e-4ed9-8165-c0f56eacaf68\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr"
Apr 16 19:18:44.728222 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.728189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqd4\" (UniqueName: \"kubernetes.io/projected/aab238ff-d84e-4ed9-8165-c0f56eacaf68-kube-api-access-9vqd4\") pod \"migrator-74bb7799d9-wvlvr\" (UID: \"aab238ff-d84e-4ed9-8165-c0f56eacaf68\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr"
Apr 16 19:18:44.737903 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.737826 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqd4\" (UniqueName: \"kubernetes.io/projected/aab238ff-d84e-4ed9-8165-c0f56eacaf68-kube-api-access-9vqd4\") pod \"migrator-74bb7799d9-wvlvr\" (UID: \"aab238ff-d84e-4ed9-8165-c0f56eacaf68\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr"
Apr 16 19:18:44.796977 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.796939 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr"
Apr 16 19:18:44.882428 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.882388 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:44.885452 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.885394 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 19:18:44.930020 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.929983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:44.930208 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.930099 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m"
Apr 16 19:18:44.930208 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.930139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for
volume \"cert\" (UniqueName: \"kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w"
Apr 16 19:18:44.930297 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:44.930267 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:18:44.930345 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:44.930334 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert podName:419f622c-e0bd-4976-87ce-7df0a1ed0500 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:46.930312296 +0000 UTC m=+38.668215562 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert") pod "ingress-canary-rdj9w" (UID: "419f622c-e0bd-4976-87ce-7df0a1ed0500") : secret "canary-serving-cert" not found
Apr 16 19:18:44.930556 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:44.930539 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:18:44.930617 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:44.930561 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd47c7899-27h7v: secret "image-registry-tls" not found
Apr 16 19:18:44.930617 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:44.930593 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:18:44.930617 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:44.930610 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls
podName:f99d9c18-6acb-4dcc-a1e9-afadc152c9e2 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:46.930596802 +0000 UTC m=+38.668500048 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls") pod "image-registry-fd47c7899-27h7v" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2") : secret "image-registry-tls" not found
Apr 16 19:18:44.930760 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:44.930655 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls podName:368abbc9-410f-479c-9fa1-3676e33eeb51 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:46.930638548 +0000 UTC m=+38.668541798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls") pod "dns-default-7v82m" (UID: "368abbc9-410f-479c-9fa1-3676e33eeb51") : secret "dns-default-metrics-tls" not found
Apr 16 19:18:44.954426 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:44.954372 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr"]
Apr 16 19:18:44.958179 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:44.958145 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab238ff_d84e_4ed9_8165_c0f56eacaf68.slice/crio-e5e4e8dd76d50520e8c6a29d0e8c4d72a4052a714768e661b7b64522fba29735 WatchSource:0}: Error finding container e5e4e8dd76d50520e8c6a29d0e8c4d72a4052a714768e661b7b64522fba29735: Status 404 returned error can't find the container with id e5e4e8dd76d50520e8c6a29d0e8c4d72a4052a714768e661b7b64522fba29735
Apr 16 19:18:45.039601 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:45.039569 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr" event={"ID":"aab238ff-d84e-4ed9-8165-c0f56eacaf68","Type":"ContainerStarted","Data":"e5e4e8dd76d50520e8c6a29d0e8c4d72a4052a714768e661b7b64522fba29735"}
Apr 16 19:18:45.132359 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:45.132322 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:45.136587 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:45.136556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ae60b17b-715c-479d-a9cc-496e69796c4e-original-pull-secret\") pod \"global-pull-secret-syncer-258dd\" (UID: \"ae60b17b-715c-479d-a9cc-496e69796c4e\") " pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:45.195863 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:45.195821 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-258dd"
Apr 16 19:18:45.335487 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:45.335396 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-258dd"]
Apr 16 19:18:45.340922 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:45.340887 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae60b17b_715c_479d_a9cc_496e69796c4e.slice/crio-460759eb03c8a045b8e9d9f4489a5a6b291b85c6e0bdb4d86b73175f68f28fee WatchSource:0}: Error finding container 460759eb03c8a045b8e9d9f4489a5a6b291b85c6e0bdb4d86b73175f68f28fee: Status 404 returned error can't find the container with id 460759eb03c8a045b8e9d9f4489a5a6b291b85c6e0bdb4d86b73175f68f28fee
Apr 16 19:18:45.597141 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:45.597063 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kndsd_c8ee73e7-c318-4fd0-a148-eef6ac668052/dns-node-resolver/0.log"
Apr 16 19:18:46.042678 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:46.042623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-258dd" event={"ID":"ae60b17b-715c-479d-a9cc-496e69796c4e","Type":"ContainerStarted","Data":"460759eb03c8a045b8e9d9f4489a5a6b291b85c6e0bdb4d86b73175f68f28fee"}
Apr 16 19:18:46.396662 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:46.396578 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v9w9k_d90da04b-9bd0-4142-9e92-b6e47a4f708c/node-ca/0.log"
Apr 16 19:18:46.949998 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:46.949955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") "
pod="openshift-dns/dns-default-7v82m"
Apr 16 19:18:46.949998 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:46.950002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w"
Apr 16 19:18:46.950532 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:46.950058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:46.950532 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:46.950110 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:18:46.950532 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:46.950180 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:18:46.950532 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:46.950189 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:18:46.950532 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:46.950203 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd47c7899-27h7v: secret "image-registry-tls" not found
Apr 16 19:18:46.950532 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:46.950182 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls podName:368abbc9-410f-479c-9fa1-3676e33eeb51 nodeName:}"
failed. No retries permitted until 2026-04-16 19:18:50.950161761 +0000 UTC m=+42.688065032 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls") pod "dns-default-7v82m" (UID: "368abbc9-410f-479c-9fa1-3676e33eeb51") : secret "dns-default-metrics-tls" not found
Apr 16 19:18:46.950532 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:46.950241 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert podName:419f622c-e0bd-4976-87ce-7df0a1ed0500 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:50.95022917 +0000 UTC m=+42.688132416 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert") pod "ingress-canary-rdj9w" (UID: "419f622c-e0bd-4976-87ce-7df0a1ed0500") : secret "canary-serving-cert" not found
Apr 16 19:18:46.950532 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:46.950258 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls podName:f99d9c18-6acb-4dcc-a1e9-afadc152c9e2 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:50.950251928 +0000 UTC m=+42.688155173 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls") pod "image-registry-fd47c7899-27h7v" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2") : secret "image-registry-tls" not found
Apr 16 19:18:47.047493 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:47.047385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr" event={"ID":"aab238ff-d84e-4ed9-8165-c0f56eacaf68","Type":"ContainerStarted","Data":"43408929713e772c71f95034c8ca844634be224c5d41c7306ee928cb2bff4e57"}
Apr 16 19:18:47.047493 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:47.047450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr" event={"ID":"aab238ff-d84e-4ed9-8165-c0f56eacaf68","Type":"ContainerStarted","Data":"5ef632465c0fd173fd71a72e6c2ccb768602723788ac8e286ba8081a5e8dbaa3"}
Apr 16 19:18:47.075305 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:47.074878 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wvlvr" podStartSLOduration=1.290401257 podStartE2EDuration="3.074856848s" podCreationTimestamp="2026-04-16 19:18:44 +0000 UTC" firstStartedPulling="2026-04-16 19:18:44.960532617 +0000 UTC m=+36.698435868" lastFinishedPulling="2026-04-16 19:18:46.744988211 +0000 UTC m=+38.482891459" observedRunningTime="2026-04-16 19:18:47.072922692 +0000 UTC m=+38.810825961" watchObservedRunningTime="2026-04-16 19:18:47.074856848 +0000 UTC m=+38.812760117"
Apr 16 19:18:50.983460 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:50.983397 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") "
pod="openshift-dns/dns-default-7v82m"
Apr 16 19:18:50.983460 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:50.983466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w"
Apr 16 19:18:50.983929 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:50.983531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:18:50.983929 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:50.983560 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:18:50.983929 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:50.983559 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:18:50.983929 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:50.983621 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert podName:419f622c-e0bd-4976-87ce-7df0a1ed0500 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:58.983603625 +0000 UTC m=+50.721506874 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert") pod "ingress-canary-rdj9w" (UID: "419f622c-e0bd-4976-87ce-7df0a1ed0500") : secret "canary-serving-cert" not found
Apr 16 19:18:50.983929 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:50.983636 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls podName:368abbc9-410f-479c-9fa1-3676e33eeb51 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:58.983630074 +0000 UTC m=+50.721533321 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls") pod "dns-default-7v82m" (UID: "368abbc9-410f-479c-9fa1-3676e33eeb51") : secret "dns-default-metrics-tls" not found
Apr 16 19:18:50.983929 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:50.983678 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:18:50.983929 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:50.983695 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-fd47c7899-27h7v: secret "image-registry-tls" not found
Apr 16 19:18:50.983929 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:18:50.983752 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls podName:f99d9c18-6acb-4dcc-a1e9-afadc152c9e2 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:58.983738119 +0000 UTC m=+50.721641364 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls") pod "image-registry-fd47c7899-27h7v" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2") : secret "image-registry-tls" not found Apr 16 19:18:52.059704 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:52.059668 2578 generic.go:358] "Generic (PLEG): container finished" podID="ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e" containerID="3135d285efc73c36c1df8df9d30837df1213358611fdfa907573bf6acf46b526" exitCode=0 Apr 16 19:18:52.060116 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:52.059737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" event={"ID":"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e","Type":"ContainerDied","Data":"3135d285efc73c36c1df8df9d30837df1213358611fdfa907573bf6acf46b526"} Apr 16 19:18:52.061070 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:52.061040 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-258dd" event={"ID":"ae60b17b-715c-479d-a9cc-496e69796c4e","Type":"ContainerStarted","Data":"5b77fc86021485c33fb92e6cb223c385f17d782926c45add3f3fe591501b215c"} Apr 16 19:18:52.097772 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:52.097720 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-258dd" podStartSLOduration=33.270661399 podStartE2EDuration="39.097704257s" podCreationTimestamp="2026-04-16 19:18:13 +0000 UTC" firstStartedPulling="2026-04-16 19:18:45.342993621 +0000 UTC m=+37.080896868" lastFinishedPulling="2026-04-16 19:18:51.170036467 +0000 UTC m=+42.907939726" observedRunningTime="2026-04-16 19:18:52.096788034 +0000 UTC m=+43.834691301" watchObservedRunningTime="2026-04-16 19:18:52.097704257 +0000 UTC m=+43.835607519" Apr 16 19:18:53.066092 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:53.066060 2578 generic.go:358] "Generic (PLEG): 
container finished" podID="ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e" containerID="72a2446bb8dd9114db869b3474df70c6208447aa9a3bd895568f5ceb32a9d56e" exitCode=0 Apr 16 19:18:53.066479 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:53.066152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" event={"ID":"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e","Type":"ContainerDied","Data":"72a2446bb8dd9114db869b3474df70c6208447aa9a3bd895568f5ceb32a9d56e"} Apr 16 19:18:54.071383 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:54.071349 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" event={"ID":"ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e","Type":"ContainerStarted","Data":"6b814b843a50c54ab4c847acd7087ca321bb0d755aef3346f6eab14fdb20e140"} Apr 16 19:18:54.098542 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:54.098486 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sqvwz" podStartSLOduration=5.541465573 podStartE2EDuration="45.098471903s" podCreationTimestamp="2026-04-16 19:18:09 +0000 UTC" firstStartedPulling="2026-04-16 19:18:11.604607149 +0000 UTC m=+3.342510406" lastFinishedPulling="2026-04-16 19:18:51.16161349 +0000 UTC m=+42.899516736" observedRunningTime="2026-04-16 19:18:54.096852045 +0000 UTC m=+45.834755313" watchObservedRunningTime="2026-04-16 19:18:54.098471903 +0000 UTC m=+45.836375170" Apr 16 19:18:59.051275 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.051227 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:59.051275 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.051272 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w" Apr 16 19:18:59.051808 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.051307 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:59.054722 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.054690 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/419f622c-e0bd-4976-87ce-7df0a1ed0500-cert\") pod \"ingress-canary-rdj9w\" (UID: \"419f622c-e0bd-4976-87ce-7df0a1ed0500\") " pod="openshift-ingress-canary/ingress-canary-rdj9w" Apr 16 19:18:59.054722 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.054710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") pod \"image-registry-fd47c7899-27h7v\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:59.054863 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.054698 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/368abbc9-410f-479c-9fa1-3676e33eeb51-metrics-tls\") pod \"dns-default-7v82m\" (UID: \"368abbc9-410f-479c-9fa1-3676e33eeb51\") " pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:59.107747 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.107707 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:18:59.126693 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.126660 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7v82m" Apr 16 19:18:59.145745 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.145712 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rdj9w" Apr 16 19:18:59.257503 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:59.257456 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf99d9c18_6acb_4dcc_a1e9_afadc152c9e2.slice/crio-7f398e09ee1c3002d2ede05b0a4e6c72660138ab4087f4c9af45b4ce4d9cf2d3 WatchSource:0}: Error finding container 7f398e09ee1c3002d2ede05b0a4e6c72660138ab4087f4c9af45b4ce4d9cf2d3: Status 404 returned error can't find the container with id 7f398e09ee1c3002d2ede05b0a4e6c72660138ab4087f4c9af45b4ce4d9cf2d3 Apr 16 19:18:59.261345 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.261032 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-fd47c7899-27h7v"] Apr 16 19:18:59.271299 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.271268 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7v82m"] Apr 16 19:18:59.274564 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:59.274538 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod368abbc9_410f_479c_9fa1_3676e33eeb51.slice/crio-10c29518c767d177b26d9c1fe18014e4f9a4d946801f393f27f535e620982e33 WatchSource:0}: Error finding container 10c29518c767d177b26d9c1fe18014e4f9a4d946801f393f27f535e620982e33: Status 404 returned error can't find the container with id 10c29518c767d177b26d9c1fe18014e4f9a4d946801f393f27f535e620982e33 Apr 16 
19:18:59.289297 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:18:59.289266 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rdj9w"] Apr 16 19:18:59.293046 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:18:59.293004 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod419f622c_e0bd_4976_87ce_7df0a1ed0500.slice/crio-3a2c14fd4eb8030537ac84a89cc2bdf91729977ce1691c2811fe5ec3ebea2c10 WatchSource:0}: Error finding container 3a2c14fd4eb8030537ac84a89cc2bdf91729977ce1691c2811fe5ec3ebea2c10: Status 404 returned error can't find the container with id 3a2c14fd4eb8030537ac84a89cc2bdf91729977ce1691c2811fe5ec3ebea2c10 Apr 16 19:19:00.083280 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:00.083236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7v82m" event={"ID":"368abbc9-410f-479c-9fa1-3676e33eeb51","Type":"ContainerStarted","Data":"10c29518c767d177b26d9c1fe18014e4f9a4d946801f393f27f535e620982e33"} Apr 16 19:19:00.084370 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:00.084335 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rdj9w" event={"ID":"419f622c-e0bd-4976-87ce-7df0a1ed0500","Type":"ContainerStarted","Data":"3a2c14fd4eb8030537ac84a89cc2bdf91729977ce1691c2811fe5ec3ebea2c10"} Apr 16 19:19:00.085755 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:00.085730 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" event={"ID":"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2","Type":"ContainerStarted","Data":"9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe"} Apr 16 19:19:00.085879 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:00.085762 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" 
event={"ID":"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2","Type":"ContainerStarted","Data":"7f398e09ee1c3002d2ede05b0a4e6c72660138ab4087f4c9af45b4ce4d9cf2d3"} Apr 16 19:19:00.086078 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:00.086053 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:19:00.107760 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:00.107694 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" podStartSLOduration=25.107675226 podStartE2EDuration="25.107675226s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:19:00.106496943 +0000 UTC m=+51.844400208" watchObservedRunningTime="2026-04-16 19:19:00.107675226 +0000 UTC m=+51.845578496" Apr 16 19:19:03.094509 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:03.094460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7v82m" event={"ID":"368abbc9-410f-479c-9fa1-3676e33eeb51","Type":"ContainerStarted","Data":"86b7176ea7077eb1dc0a73659e6f2beea7176a3f7f9e732c415ea40f801bd0be"} Apr 16 19:19:03.094509 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:03.094511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7v82m" event={"ID":"368abbc9-410f-479c-9fa1-3676e33eeb51","Type":"ContainerStarted","Data":"e81a096d1be382a26d9d01ee00c7474010896712fa8ad471c4187f64425a0e09"} Apr 16 19:19:03.095054 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:03.094622 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7v82m" Apr 16 19:19:03.095788 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:03.095764 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-rdj9w" event={"ID":"419f622c-e0bd-4976-87ce-7df0a1ed0500","Type":"ContainerStarted","Data":"32da6d663168c59b8bad20e60bc53b9b18ac2884bb14c292ff7a7f2f67f1abc4"} Apr 16 19:19:03.114770 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:03.114692 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7v82m" podStartSLOduration=17.22357842 podStartE2EDuration="20.114678779s" podCreationTimestamp="2026-04-16 19:18:43 +0000 UTC" firstStartedPulling="2026-04-16 19:18:59.276777076 +0000 UTC m=+51.014680323" lastFinishedPulling="2026-04-16 19:19:02.167877422 +0000 UTC m=+53.905780682" observedRunningTime="2026-04-16 19:19:03.113466941 +0000 UTC m=+54.851370209" watchObservedRunningTime="2026-04-16 19:19:03.114678779 +0000 UTC m=+54.852582047" Apr 16 19:19:03.130426 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:03.130357 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rdj9w" podStartSLOduration=17.252744992 podStartE2EDuration="20.130341319s" podCreationTimestamp="2026-04-16 19:18:43 +0000 UTC" firstStartedPulling="2026-04-16 19:18:59.295096783 +0000 UTC m=+51.033000049" lastFinishedPulling="2026-04-16 19:19:02.172693127 +0000 UTC m=+53.910596376" observedRunningTime="2026-04-16 19:19:03.12955925 +0000 UTC m=+54.867462517" watchObservedRunningTime="2026-04-16 19:19:03.130341319 +0000 UTC m=+54.868244595" Apr 16 19:19:08.035966 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:08.035937 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bcplr" Apr 16 19:19:13.101988 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.101953 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7v82m" Apr 16 19:19:13.125375 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.125339 2578 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-insights/insights-runtime-extractor-qr8c4"] Apr 16 19:19:13.129825 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.129805 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.135672 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.135647 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:19:13.135877 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.135863 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:19:13.136184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.136152 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:19:13.136303 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.136241 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vl99s\"" Apr 16 19:19:13.141314 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.141296 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:19:13.146387 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.146363 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qr8c4"] Apr 16 19:19:13.154588 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.154554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fll25\" (UniqueName: \"kubernetes.io/projected/245c93d8-aeaf-4048-9b6a-90741c0996b3-kube-api-access-fll25\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " 
pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.154733 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.154627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/245c93d8-aeaf-4048-9b6a-90741c0996b3-crio-socket\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.154733 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.154653 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/245c93d8-aeaf-4048-9b6a-90741c0996b3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.154865 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.154768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/245c93d8-aeaf-4048-9b6a-90741c0996b3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.154865 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.154859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/245c93d8-aeaf-4048-9b6a-90741c0996b3-data-volume\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.171372 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.171340 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-fd47c7899-27h7v"] Apr 16 19:19:13.227039 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.227004 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"] Apr 16 19:19:13.229810 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.229792 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z" Apr 16 19:19:13.232252 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.232229 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 19:19:13.232597 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.232573 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 19:19:13.232886 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.232871 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 19:19:13.232966 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.232953 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 19:19:13.249524 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.249494 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"] Apr 16 19:19:13.255910 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.255884 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/245c93d8-aeaf-4048-9b6a-90741c0996b3-data-volume\") pod 
\"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.256066 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.255925 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fll25\" (UniqueName: \"kubernetes.io/projected/245c93d8-aeaf-4048-9b6a-90741c0996b3-kube-api-access-fll25\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.256066 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.255957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/245c93d8-aeaf-4048-9b6a-90741c0996b3-crio-socket\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.256189 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.256081 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/245c93d8-aeaf-4048-9b6a-90741c0996b3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.256189 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.256122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8flbv\" (UniqueName: \"kubernetes.io/projected/a4d76a52-2d19-47e6-8416-a11e4041f038-kube-api-access-8flbv\") pod \"klusterlet-addon-workmgr-764cfd7c89-c6g8z\" (UID: \"a4d76a52-2d19-47e6-8416-a11e4041f038\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z" Apr 16 19:19:13.256189 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:19:13.256143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/245c93d8-aeaf-4048-9b6a-90741c0996b3-crio-socket\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.256189 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.256170 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/245c93d8-aeaf-4048-9b6a-90741c0996b3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.256369 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.256206 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4d76a52-2d19-47e6-8416-a11e4041f038-tmp\") pod \"klusterlet-addon-workmgr-764cfd7c89-c6g8z\" (UID: \"a4d76a52-2d19-47e6-8416-a11e4041f038\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z" Apr 16 19:19:13.256369 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.256257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a4d76a52-2d19-47e6-8416-a11e4041f038-klusterlet-config\") pod \"klusterlet-addon-workmgr-764cfd7c89-c6g8z\" (UID: \"a4d76a52-2d19-47e6-8416-a11e4041f038\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z" Apr 16 19:19:13.256369 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.256307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/245c93d8-aeaf-4048-9b6a-90741c0996b3-data-volume\") 
pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.256771 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.256747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/245c93d8-aeaf-4048-9b6a-90741c0996b3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.258869 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.258848 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/245c93d8-aeaf-4048-9b6a-90741c0996b3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.278630 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.278586 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fll25\" (UniqueName: \"kubernetes.io/projected/245c93d8-aeaf-4048-9b6a-90741c0996b3-kube-api-access-fll25\") pod \"insights-runtime-extractor-qr8c4\" (UID: \"245c93d8-aeaf-4048-9b6a-90741c0996b3\") " pod="openshift-insights/insights-runtime-extractor-qr8c4" Apr 16 19:19:13.310799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.310760 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-698cb67655-bhcxl"] Apr 16 19:19:13.313943 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.313925 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-698cb67655-bhcxl" Apr 16 19:19:13.329537 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.329503 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-698cb67655-bhcxl"] Apr 16 19:19:13.357139 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357049 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4d76a52-2d19-47e6-8416-a11e4041f038-tmp\") pod \"klusterlet-addon-workmgr-764cfd7c89-c6g8z\" (UID: \"a4d76a52-2d19-47e6-8416-a11e4041f038\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z" Apr 16 19:19:13.357139 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7a37b419-4e8b-4075-8e76-e678c39e2bdb-image-registry-private-configuration\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl" Apr 16 19:19:13.357139 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357115 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a37b419-4e8b-4075-8e76-e678c39e2bdb-registry-certificates\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl" Apr 16 19:19:13.357139 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357133 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a37b419-4e8b-4075-8e76-e678c39e2bdb-installation-pull-secrets\") pod 
\"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl" Apr 16 19:19:13.357377 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357183 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a37b419-4e8b-4075-8e76-e678c39e2bdb-ca-trust-extracted\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl" Apr 16 19:19:13.357377 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357229 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a37b419-4e8b-4075-8e76-e678c39e2bdb-bound-sa-token\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl" Apr 16 19:19:13.357377 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357289 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a37b419-4e8b-4075-8e76-e678c39e2bdb-registry-tls\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl" Apr 16 19:19:13.357377 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357315 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42cm6\" (UniqueName: \"kubernetes.io/projected/7a37b419-4e8b-4075-8e76-e678c39e2bdb-kube-api-access-42cm6\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl" Apr 16 19:19:13.357377 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:19:13.357345    2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a4d76a52-2d19-47e6-8416-a11e4041f038-klusterlet-config\") pod \"klusterlet-addon-workmgr-764cfd7c89-c6g8z\" (UID: \"a4d76a52-2d19-47e6-8416-a11e4041f038\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"
Apr 16 19:19:13.357568 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a37b419-4e8b-4075-8e76-e678c39e2bdb-trusted-ca\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.357568 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357426 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8flbv\" (UniqueName: \"kubernetes.io/projected/a4d76a52-2d19-47e6-8416-a11e4041f038-kube-api-access-8flbv\") pod \"klusterlet-addon-workmgr-764cfd7c89-c6g8z\" (UID: \"a4d76a52-2d19-47e6-8416-a11e4041f038\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"
Apr 16 19:19:13.357568 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.357486 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4d76a52-2d19-47e6-8416-a11e4041f038-tmp\") pod \"klusterlet-addon-workmgr-764cfd7c89-c6g8z\" (UID: \"a4d76a52-2d19-47e6-8416-a11e4041f038\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"
Apr 16 19:19:13.359904 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.359884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/a4d76a52-2d19-47e6-8416-a11e4041f038-klusterlet-config\") pod \"klusterlet-addon-workmgr-764cfd7c89-c6g8z\" (UID: \"a4d76a52-2d19-47e6-8416-a11e4041f038\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"
Apr 16 19:19:13.368863 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.368838 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8flbv\" (UniqueName: \"kubernetes.io/projected/a4d76a52-2d19-47e6-8416-a11e4041f038-kube-api-access-8flbv\") pod \"klusterlet-addon-workmgr-764cfd7c89-c6g8z\" (UID: \"a4d76a52-2d19-47e6-8416-a11e4041f038\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"
Apr 16 19:19:13.439938 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.439896 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qr8c4"
Apr 16 19:19:13.458070 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.458030 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7a37b419-4e8b-4075-8e76-e678c39e2bdb-image-registry-private-configuration\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.458070 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.458070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a37b419-4e8b-4075-8e76-e678c39e2bdb-registry-certificates\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.458313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.458087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a37b419-4e8b-4075-8e76-e678c39e2bdb-installation-pull-secrets\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.458313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.458108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a37b419-4e8b-4075-8e76-e678c39e2bdb-ca-trust-extracted\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.458313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.458156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a37b419-4e8b-4075-8e76-e678c39e2bdb-bound-sa-token\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.458313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.458183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a37b419-4e8b-4075-8e76-e678c39e2bdb-registry-tls\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.458313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.458217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42cm6\" (UniqueName: \"kubernetes.io/projected/7a37b419-4e8b-4075-8e76-e678c39e2bdb-kube-api-access-42cm6\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.458313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.458268 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a37b419-4e8b-4075-8e76-e678c39e2bdb-trusted-ca\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.458835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.458790 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7a37b419-4e8b-4075-8e76-e678c39e2bdb-ca-trust-extracted\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.460367 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.460063 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7a37b419-4e8b-4075-8e76-e678c39e2bdb-registry-certificates\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.460367 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.460218 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7a37b419-4e8b-4075-8e76-e678c39e2bdb-trusted-ca\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.463879 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.461654 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7a37b419-4e8b-4075-8e76-e678c39e2bdb-installation-pull-secrets\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.463879 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.461765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/7a37b419-4e8b-4075-8e76-e678c39e2bdb-image-registry-private-configuration\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.464695 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.464669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7a37b419-4e8b-4075-8e76-e678c39e2bdb-registry-tls\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.468737 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.468713 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7a37b419-4e8b-4075-8e76-e678c39e2bdb-bound-sa-token\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.468878 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.468847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42cm6\" (UniqueName: \"kubernetes.io/projected/7a37b419-4e8b-4075-8e76-e678c39e2bdb-kube-api-access-42cm6\") pod \"image-registry-698cb67655-bhcxl\" (UID: \"7a37b419-4e8b-4075-8e76-e678c39e2bdb\") " pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.539824 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.539788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"
Apr 16 19:19:13.578962 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.578925 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qr8c4"]
Apr 16 19:19:13.582512 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:13.582484 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod245c93d8_aeaf_4048_9b6a_90741c0996b3.slice/crio-603f5fc8ebcc075a35b1c3af38a5cde63a9d59f0e598ffa9312bc798bfc46b91 WatchSource:0}: Error finding container 603f5fc8ebcc075a35b1c3af38a5cde63a9d59f0e598ffa9312bc798bfc46b91: Status 404 returned error can't find the container with id 603f5fc8ebcc075a35b1c3af38a5cde63a9d59f0e598ffa9312bc798bfc46b91
Apr 16 19:19:13.622591 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.622561 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:13.670716 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.670662 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"]
Apr 16 19:19:13.675306 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:13.675267 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d76a52_2d19_47e6_8416_a11e4041f038.slice/crio-13d696021bf7c59d7a892360e51aded2af324115ea08109452470c4da26cbb6a WatchSource:0}: Error finding container 13d696021bf7c59d7a892360e51aded2af324115ea08109452470c4da26cbb6a: Status 404 returned error can't find the container with id 13d696021bf7c59d7a892360e51aded2af324115ea08109452470c4da26cbb6a
Apr 16 19:19:13.753506 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:13.753470 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-698cb67655-bhcxl"]
Apr 16 19:19:13.757275 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:13.757247 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a37b419_4e8b_4075_8e76_e678c39e2bdb.slice/crio-0b1aca260b603af892ba264f6163d256d5ac04e427ec298aaed6226f185516c1 WatchSource:0}: Error finding container 0b1aca260b603af892ba264f6163d256d5ac04e427ec298aaed6226f185516c1: Status 404 returned error can't find the container with id 0b1aca260b603af892ba264f6163d256d5ac04e427ec298aaed6226f185516c1
Apr 16 19:19:14.126800 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.126756 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z" event={"ID":"a4d76a52-2d19-47e6-8416-a11e4041f038","Type":"ContainerStarted","Data":"13d696021bf7c59d7a892360e51aded2af324115ea08109452470c4da26cbb6a"}
Apr 16 19:19:14.128266 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.128238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qr8c4" event={"ID":"245c93d8-aeaf-4048-9b6a-90741c0996b3","Type":"ContainerStarted","Data":"7216ded20a0755ac1e9bee3d24cc80e00d97c777c0d23a241cc9e438d7a7015f"}
Apr 16 19:19:14.128400 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.128271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qr8c4" event={"ID":"245c93d8-aeaf-4048-9b6a-90741c0996b3","Type":"ContainerStarted","Data":"603f5fc8ebcc075a35b1c3af38a5cde63a9d59f0e598ffa9312bc798bfc46b91"}
Apr 16 19:19:14.129589 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.129565 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-698cb67655-bhcxl" event={"ID":"7a37b419-4e8b-4075-8e76-e678c39e2bdb","Type":"ContainerStarted","Data":"ae02d938d9b97316fecadd0df5f8430a2b9bd2ec5377341fb855a8c97f9810a6"}
Apr 16 19:19:14.129589 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.129589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-698cb67655-bhcxl" event={"ID":"7a37b419-4e8b-4075-8e76-e678c39e2bdb","Type":"ContainerStarted","Data":"0b1aca260b603af892ba264f6163d256d5ac04e427ec298aaed6226f185516c1"}
Apr 16 19:19:14.129734 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.129714 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-698cb67655-bhcxl"
Apr 16 19:19:14.155190 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.155101 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-698cb67655-bhcxl" podStartSLOduration=1.155081575 podStartE2EDuration="1.155081575s" podCreationTimestamp="2026-04-16 19:19:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:19:14.153998041 +0000 UTC m=+65.891901311" watchObservedRunningTime="2026-04-16 19:19:14.155081575 +0000 UTC m=+65.892984820"
Apr 16 19:19:14.570275 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.570232 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:19:14.572959 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.572932 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:19:14.584711 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.584667 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81fa50c0-8c06-4a6c-9d00-a1ed89b88844-metrics-certs\") pod \"network-metrics-daemon-xhnjz\" (UID: \"81fa50c0-8c06-4a6c-9d00-a1ed89b88844\") " pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:19:14.671583 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.671481 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:19:14.674260 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.674233 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:19:14.685247 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.685213 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:19:14.696696 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.696664 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fbx4\" (UniqueName: \"kubernetes.io/projected/82ff5922-30ac-4de1-81e9-3d15bce731aa-kube-api-access-8fbx4\") pod \"network-check-target-m5l2p\" (UID: \"82ff5922-30ac-4de1-81e9-3d15bce731aa\") " pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:19:14.796549 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.796511 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s2294\""
Apr 16 19:19:14.802162 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.802131 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t5lg8\""
Apr 16 19:19:14.804272 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.804241 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:19:14.809684 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:14.809656 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xhnjz"
Apr 16 19:19:15.002772 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:15.002719 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-m5l2p"]
Apr 16 19:19:15.008843 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:15.008798 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ff5922_30ac_4de1_81e9_3d15bce731aa.slice/crio-d6b84cf5a59a7d9238a58db5e9940cd0fc9327964ac5f44e584e57ede73bcdc2 WatchSource:0}: Error finding container d6b84cf5a59a7d9238a58db5e9940cd0fc9327964ac5f44e584e57ede73bcdc2: Status 404 returned error can't find the container with id d6b84cf5a59a7d9238a58db5e9940cd0fc9327964ac5f44e584e57ede73bcdc2
Apr 16 19:19:15.021770 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:15.021716 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xhnjz"]
Apr 16 19:19:15.027697 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:15.027652 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81fa50c0_8c06_4a6c_9d00_a1ed89b88844.slice/crio-5272d86789060ef93bd2e60f9b02b882de173eceafbc15822fd8a76c2e511726 WatchSource:0}: Error finding container 5272d86789060ef93bd2e60f9b02b882de173eceafbc15822fd8a76c2e511726: Status 404 returned error can't find the container with id 5272d86789060ef93bd2e60f9b02b882de173eceafbc15822fd8a76c2e511726
Apr 16 19:19:15.136721 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:15.136660 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qr8c4" event={"ID":"245c93d8-aeaf-4048-9b6a-90741c0996b3","Type":"ContainerStarted","Data":"d828043af478711006758dd7c80329d0ad633819fda03eebac1baf00994f5d32"}
Apr 16 19:19:15.138205 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:15.138110 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xhnjz" event={"ID":"81fa50c0-8c06-4a6c-9d00-a1ed89b88844","Type":"ContainerStarted","Data":"5272d86789060ef93bd2e60f9b02b882de173eceafbc15822fd8a76c2e511726"}
Apr 16 19:19:15.139750 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:15.139704 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m5l2p" event={"ID":"82ff5922-30ac-4de1-81e9-3d15bce731aa","Type":"ContainerStarted","Data":"d6b84cf5a59a7d9238a58db5e9940cd0fc9327964ac5f44e584e57ede73bcdc2"}
Apr 16 19:19:17.148519 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.148468 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qr8c4" event={"ID":"245c93d8-aeaf-4048-9b6a-90741c0996b3","Type":"ContainerStarted","Data":"1306956084cd57b0a89c49def2e35c2c2f3db5f6c93bffea0a5ac8bb61554896"}
Apr 16 19:19:17.876199 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.873772 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qr8c4" podStartSLOduration=1.5082015869999998 podStartE2EDuration="4.873751023s" podCreationTimestamp="2026-04-16 19:19:13 +0000 UTC" firstStartedPulling="2026-04-16 19:19:13.652066721 +0000 UTC m=+65.389969967" lastFinishedPulling="2026-04-16 19:19:17.017616146 +0000 UTC m=+68.755519403" observedRunningTime="2026-04-16 19:19:17.178784552 +0000 UTC m=+68.916687824" watchObservedRunningTime="2026-04-16 19:19:17.873751023 +0000 UTC m=+69.611654291"
Apr 16 19:19:17.876199 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.874980 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"]
Apr 16 19:19:17.878505 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.878481 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:17.881330 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.881309 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 19:19:17.881474 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.881313 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 19:19:17.882198 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.882166 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 19:19:17.882324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.882223 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-8txxs\""
Apr 16 19:19:17.882387 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.882323 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 19:19:17.882452 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.882440 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 19:19:17.888993 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.888963 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"]
Apr 16 19:19:17.999514 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.999381 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bdf9751d-36a3-47a6-93d9-c268be17be2f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:17.999514 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.999458 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf9751d-36a3-47a6-93d9-c268be17be2f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:17.999514 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.999491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bdf9751d-36a3-47a6-93d9-c268be17be2f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:17.999514 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:17.999516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tf7hx\" (UniqueName: \"kubernetes.io/projected/bdf9751d-36a3-47a6-93d9-c268be17be2f-kube-api-access-tf7hx\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:18.100907 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:18.100869 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf9751d-36a3-47a6-93d9-c268be17be2f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:18.101082 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:18.100914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bdf9751d-36a3-47a6-93d9-c268be17be2f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:18.101082 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:18.100956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tf7hx\" (UniqueName: \"kubernetes.io/projected/bdf9751d-36a3-47a6-93d9-c268be17be2f-kube-api-access-tf7hx\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:18.101082 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:18.101010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bdf9751d-36a3-47a6-93d9-c268be17be2f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:18.101715 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:18.101687 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bdf9751d-36a3-47a6-93d9-c268be17be2f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:18.104136 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:18.104100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bdf9751d-36a3-47a6-93d9-c268be17be2f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:18.104255 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:18.104207 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bdf9751d-36a3-47a6-93d9-c268be17be2f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:18.110282 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:18.110258 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tf7hx\" (UniqueName: \"kubernetes.io/projected/bdf9751d-36a3-47a6-93d9-c268be17be2f-kube-api-access-tf7hx\") pod \"prometheus-operator-5676c8c784-h4qrx\" (UID: \"bdf9751d-36a3-47a6-93d9-c268be17be2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:18.190482 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:18.190383 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"
Apr 16 19:19:19.728076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:19.727992 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-h4qrx"]
Apr 16 19:19:19.731521 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:19.731476 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdf9751d_36a3_47a6_93d9_c268be17be2f.slice/crio-85d2c95dcf2ee2d23b2582508af7206951281377ee733f94dae6e7cfcffd23a3 WatchSource:0}: Error finding container 85d2c95dcf2ee2d23b2582508af7206951281377ee733f94dae6e7cfcffd23a3: Status 404 returned error can't find the container with id 85d2c95dcf2ee2d23b2582508af7206951281377ee733f94dae6e7cfcffd23a3
Apr 16 19:19:20.159069 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.158974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-m5l2p" event={"ID":"82ff5922-30ac-4de1-81e9-3d15bce731aa","Type":"ContainerStarted","Data":"3d56ac63a0988f97327273a4876291c6435e5d75e172cf4b5432a3faa5aa6984"}
Apr 16 19:19:20.159069 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.159045 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:19:20.160265 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.160237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z" event={"ID":"a4d76a52-2d19-47e6-8416-a11e4041f038","Type":"ContainerStarted","Data":"15c295bc46664de9c8326bb719181da4b621001c927f832046b84885430b2eac"}
Apr 16 19:19:20.160567 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.160552 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"
Apr 16 19:19:20.161989 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.161930 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xhnjz" event={"ID":"81fa50c0-8c06-4a6c-9d00-a1ed89b88844","Type":"ContainerStarted","Data":"8cd1a4b2bed1dbcb8a711f45d47454f03feb90047088b4aca5cdc28068e2ddf7"}
Apr 16 19:19:20.161989 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.161963 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xhnjz" event={"ID":"81fa50c0-8c06-4a6c-9d00-a1ed89b88844","Type":"ContainerStarted","Data":"cbb7e4dcaf880d685ba0dd2f64e7345b5e126794172714346d8102193d4f94c7"}
Apr 16 19:19:20.162619 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.162601 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z"
Apr 16 19:19:20.163166 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.163143 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx" event={"ID":"bdf9751d-36a3-47a6-93d9-c268be17be2f","Type":"ContainerStarted","Data":"85d2c95dcf2ee2d23b2582508af7206951281377ee733f94dae6e7cfcffd23a3"}
Apr 16 19:19:20.175576 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.175514 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-m5l2p" podStartSLOduration=67.604681217 podStartE2EDuration="1m12.175498432s" podCreationTimestamp="2026-04-16 19:18:08 +0000 UTC" firstStartedPulling="2026-04-16 19:19:15.01155338 +0000 UTC m=+66.749456632" lastFinishedPulling="2026-04-16 19:19:19.582370585 +0000 UTC m=+71.320273847" observedRunningTime="2026-04-16 19:19:20.175110779 +0000 UTC m=+71.913014047" watchObservedRunningTime="2026-04-16 19:19:20.175498432 +0000 UTC m=+71.913401710"
Apr 16 19:19:20.189847 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.189780 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764cfd7c89-c6g8z" podStartSLOduration=1.276504698 podStartE2EDuration="7.189760803s" podCreationTimestamp="2026-04-16 19:19:13 +0000 UTC" firstStartedPulling="2026-04-16 19:19:13.677731215 +0000 UTC m=+65.415634476" lastFinishedPulling="2026-04-16 19:19:19.590987321 +0000 UTC m=+71.328890581" observedRunningTime="2026-04-16 19:19:20.189377385 +0000 UTC m=+71.927280654" watchObservedRunningTime="2026-04-16 19:19:20.189760803 +0000 UTC m=+71.927664049"
Apr 16 19:19:20.205524 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:20.205463 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xhnjz" podStartSLOduration=67.661694829 podStartE2EDuration="1m12.205441361s" podCreationTimestamp="2026-04-16 19:18:08 +0000 UTC" firstStartedPulling="2026-04-16 19:19:15.030479339 +0000 UTC m=+66.768382590" lastFinishedPulling="2026-04-16 19:19:19.574225862 +0000 UTC m=+71.312129122" observedRunningTime="2026-04-16 19:19:20.20426333 +0000 UTC m=+71.942166633" watchObservedRunningTime="2026-04-16 19:19:20.205441361 +0000 UTC m=+71.943344630"
Apr 16 19:19:22.169783 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:22.169739 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx" event={"ID":"bdf9751d-36a3-47a6-93d9-c268be17be2f","Type":"ContainerStarted","Data":"6296817a2051a80df0075472986d3a840d73fa6ffcf5dd0742055b233b7de0d1"}
Apr 16 19:19:22.169783 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:22.169780 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx" event={"ID":"bdf9751d-36a3-47a6-93d9-c268be17be2f","Type":"ContainerStarted","Data":"346bef6276300d403d55f2bb0f17ef723a7e2d5b837baffbb640d95c79f4e8ac"}
Apr 16 19:19:22.188333 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:22.188238 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-h4qrx" podStartSLOduration=3.447728818 podStartE2EDuration="5.188220606s" podCreationTimestamp="2026-04-16 19:19:17 +0000 UTC" firstStartedPulling="2026-04-16 19:19:19.734895543 +0000 UTC m=+71.472798804" lastFinishedPulling="2026-04-16 19:19:21.475387343 +0000 UTC m=+73.213290592" observedRunningTime="2026-04-16 19:19:22.187228641 +0000 UTC m=+73.925131910" watchObservedRunningTime="2026-04-16 19:19:22.188220606 +0000 UTC m=+73.926123873"
Apr 16 19:19:23.176660 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:23.176625 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-fd47c7899-27h7v"
Apr 16 19:19:24.314816 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.314783 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-s75jz"]
Apr 16 19:19:24.318537 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.318519 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s75jz"
Apr 16 19:19:24.321549 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.321520 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 19:19:24.321654 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.321600 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 19:19:24.322114 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.322096 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 19:19:24.322173 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.322114 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-9z59t\""
Apr 16 19:19:24.350395 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.350369 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-textfile\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz"
Apr 16 19:19:24.350535 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.350401 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-tls\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz"
Apr 16 19:19:24.350535 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.350478 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz"
Apr 16 19:19:24.350610 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.350548 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c95ee486-2760-4ad8-9188-14bfc1cb67df-sys\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz"
Apr 16 19:19:24.350610 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.350567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c95ee486-2760-4ad8-9188-14bfc1cb67df-metrics-client-ca\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz"
Apr 16 19:19:24.350610 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.350581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnr2h\" (UniqueName: \"kubernetes.io/projected/c95ee486-2760-4ad8-9188-14bfc1cb67df-kube-api-access-gnr2h\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz"
Apr 16 19:19:24.350610 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.350600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-accelerators-collector-config\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") "
pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.350730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.350668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c95ee486-2760-4ad8-9188-14bfc1cb67df-root\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.350730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.350697 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-wtmp\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451225 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451181 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-textfile\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451225 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-tls\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451528 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451528 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451298 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c95ee486-2760-4ad8-9188-14bfc1cb67df-sys\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451528 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451324 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c95ee486-2760-4ad8-9188-14bfc1cb67df-metrics-client-ca\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451528 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451347 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnr2h\" (UniqueName: \"kubernetes.io/projected/c95ee486-2760-4ad8-9188-14bfc1cb67df-kube-api-access-gnr2h\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451528 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-accelerators-collector-config\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451528 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451439 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c95ee486-2760-4ad8-9188-14bfc1cb67df-sys\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451528 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c95ee486-2760-4ad8-9188-14bfc1cb67df-root\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451528 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451511 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-wtmp\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451831 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-textfile\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451831 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451688 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-wtmp\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.451831 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451704 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c95ee486-2760-4ad8-9188-14bfc1cb67df-root\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.452082 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c95ee486-2760-4ad8-9188-14bfc1cb67df-metrics-client-ca\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.452082 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.451984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-accelerators-collector-config\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.453872 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.453851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.453968 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.453868 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c95ee486-2760-4ad8-9188-14bfc1cb67df-node-exporter-tls\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.460634 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.460614 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnr2h\" (UniqueName: \"kubernetes.io/projected/c95ee486-2760-4ad8-9188-14bfc1cb67df-kube-api-access-gnr2h\") pod \"node-exporter-s75jz\" (UID: \"c95ee486-2760-4ad8-9188-14bfc1cb67df\") " pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.627392 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:24.627309 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s75jz" Apr 16 19:19:24.635520 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:24.635486 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc95ee486_2760_4ad8_9188_14bfc1cb67df.slice/crio-0fda53d9635f8f1130601464c8a9024e5dd82fbd1cd336df05c1aef335d90630 WatchSource:0}: Error finding container 0fda53d9635f8f1130601464c8a9024e5dd82fbd1cd336df05c1aef335d90630: Status 404 returned error can't find the container with id 0fda53d9635f8f1130601464c8a9024e5dd82fbd1cd336df05c1aef335d90630 Apr 16 19:19:25.179258 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:25.179217 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s75jz" event={"ID":"c95ee486-2760-4ad8-9188-14bfc1cb67df","Type":"ContainerStarted","Data":"0fda53d9635f8f1130601464c8a9024e5dd82fbd1cd336df05c1aef335d90630"} Apr 16 19:19:26.183078 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:26.183045 2578 generic.go:358] "Generic (PLEG): container finished" podID="c95ee486-2760-4ad8-9188-14bfc1cb67df" containerID="17d08d4adfd2b42e542038de36d4c2975af314753a57b1f4ceb4b892c49f9290" exitCode=0 Apr 16 19:19:26.183480 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:26.183090 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s75jz" 
event={"ID":"c95ee486-2760-4ad8-9188-14bfc1cb67df","Type":"ContainerDied","Data":"17d08d4adfd2b42e542038de36d4c2975af314753a57b1f4ceb4b892c49f9290"} Apr 16 19:19:27.188106 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.188070 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s75jz" event={"ID":"c95ee486-2760-4ad8-9188-14bfc1cb67df","Type":"ContainerStarted","Data":"e5c255e6cf7738eb386a467aafcea5f2a24e12804a37f28f14b374c58d41beea"} Apr 16 19:19:27.188106 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.188107 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s75jz" event={"ID":"c95ee486-2760-4ad8-9188-14bfc1cb67df","Type":"ContainerStarted","Data":"02f1d128e50b66af7a8f7ca8547a956a904109a7a460e1d83674ef3aeb68547e"} Apr 16 19:19:27.214060 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.214001 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-s75jz" podStartSLOduration=2.281028502 podStartE2EDuration="3.213979744s" podCreationTimestamp="2026-04-16 19:19:24 +0000 UTC" firstStartedPulling="2026-04-16 19:19:24.637256854 +0000 UTC m=+76.375160101" lastFinishedPulling="2026-04-16 19:19:25.570208081 +0000 UTC m=+77.308111343" observedRunningTime="2026-04-16 19:19:27.213798565 +0000 UTC m=+78.951701833" watchObservedRunningTime="2026-04-16 19:19:27.213979744 +0000 UTC m=+78.951883013" Apr 16 19:19:27.380972 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.380936 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5774b77849-6p65s"] Apr 16 19:19:27.384394 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.384371 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.387123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.387092 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 19:19:27.387123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.387121 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 19:19:27.387342 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.387239 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 19:19:27.387430 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.387382 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 19:19:27.387497 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.387450 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 19:19:27.387556 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.387539 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6p7lo6fba2gn9\"" Apr 16 19:19:27.387628 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.387383 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-rr76k\"" Apr 16 19:19:27.397307 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.397280 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5774b77849-6p65s"] Apr 16 19:19:27.476973 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.476876 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.476973 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.476923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-grpc-tls\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.476973 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.476951 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwlzm\" (UniqueName: \"kubernetes.io/projected/c3c44c36-e89c-4df8-bf92-0bc611cfe392-kube-api-access-wwlzm\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.477184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.476994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.477184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.477050 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.477184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.477086 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c3c44c36-e89c-4df8-bf92-0bc611cfe392-metrics-client-ca\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.477184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.477112 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-tls\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.477184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.477131 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.577891 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.577844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-grpc-tls\") pod 
\"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.577891 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.577889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwlzm\" (UniqueName: \"kubernetes.io/projected/c3c44c36-e89c-4df8-bf92-0bc611cfe392-kube-api-access-wwlzm\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.578119 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.577910 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.578119 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.578067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.578219 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.578126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c3c44c36-e89c-4df8-bf92-0bc611cfe392-metrics-client-ca\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.578219 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:19:27.578156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-tls\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.578219 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.578174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.578483 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.578233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.579124 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.579098 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c3c44c36-e89c-4df8-bf92-0bc611cfe392-metrics-client-ca\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.580936 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.580898 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-grpc-tls\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.581237 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.581216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.581378 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.581356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.581477 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.581459 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.581512 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.581464 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-tls\") pod \"thanos-querier-5774b77849-6p65s\" (UID: 
\"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.581774 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.581756 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c3c44c36-e89c-4df8-bf92-0bc611cfe392-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.587008 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.586983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwlzm\" (UniqueName: \"kubernetes.io/projected/c3c44c36-e89c-4df8-bf92-0bc611cfe392-kube-api-access-wwlzm\") pod \"thanos-querier-5774b77849-6p65s\" (UID: \"c3c44c36-e89c-4df8-bf92-0bc611cfe392\") " pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.694140 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.694099 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:27.823640 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:27.823613 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5774b77849-6p65s"] Apr 16 19:19:27.826431 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:27.826384 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c44c36_e89c_4df8_bf92_0bc611cfe392.slice/crio-49a7a77765ad5068d8f88dc4c5852884e99f5563be8567b11d5aa03e0d3d8d00 WatchSource:0}: Error finding container 49a7a77765ad5068d8f88dc4c5852884e99f5563be8567b11d5aa03e0d3d8d00: Status 404 returned error can't find the container with id 49a7a77765ad5068d8f88dc4c5852884e99f5563be8567b11d5aa03e0d3d8d00 Apr 16 19:19:28.191313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:28.191228 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" event={"ID":"c3c44c36-e89c-4df8-bf92-0bc611cfe392","Type":"ContainerStarted","Data":"49a7a77765ad5068d8f88dc4c5852884e99f5563be8567b11d5aa03e0d3d8d00"} Apr 16 19:19:30.198724 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:30.198679 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" event={"ID":"c3c44c36-e89c-4df8-bf92-0bc611cfe392","Type":"ContainerStarted","Data":"3ad2e0e47b99246340f51626a310ffac938715ddb3e3017a0249e88dbe932216"} Apr 16 19:19:30.198724 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:30.198719 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" event={"ID":"c3c44c36-e89c-4df8-bf92-0bc611cfe392","Type":"ContainerStarted","Data":"564053013ead0ada2972b9269f6a495c99f3cddec19545e527a0cd61b2e1071f"} Apr 16 19:19:30.198724 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:30.198728 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" event={"ID":"c3c44c36-e89c-4df8-bf92-0bc611cfe392","Type":"ContainerStarted","Data":"ea96a6efc10defe9d31d44e2306308d5d9c1f3f5a9842474b9f42fa010ddead1"} Apr 16 19:19:31.204495 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:31.204463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" event={"ID":"c3c44c36-e89c-4df8-bf92-0bc611cfe392","Type":"ContainerStarted","Data":"c01a9779bca349f96286e3f3e5785de3e286ab3e15a9e5de1fd93dce82a0e697"} Apr 16 19:19:31.204846 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:31.204503 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" event={"ID":"c3c44c36-e89c-4df8-bf92-0bc611cfe392","Type":"ContainerStarted","Data":"b35217d21cf63814208bf4917891caa65fdfe7c4d983a568602a85b9059dcd44"} Apr 16 19:19:32.209443 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:32.209385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" event={"ID":"c3c44c36-e89c-4df8-bf92-0bc611cfe392","Type":"ContainerStarted","Data":"b687990e17275e2aa9574843db3ad622dfcf6fa0cd9cea81b606122c95c80844"} Apr 16 19:19:32.209889 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:32.209539 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:32.242145 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:32.242082 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" podStartSLOduration=1.999552881 podStartE2EDuration="5.242061749s" podCreationTimestamp="2026-04-16 19:19:27 +0000 UTC" firstStartedPulling="2026-04-16 19:19:27.828831157 +0000 UTC m=+79.566734416" lastFinishedPulling="2026-04-16 19:19:31.071340035 +0000 UTC m=+82.809243284" 
observedRunningTime="2026-04-16 19:19:32.239559112 +0000 UTC m=+83.977462379" watchObservedRunningTime="2026-04-16 19:19:32.242061749 +0000 UTC m=+83.979965018" Apr 16 19:19:35.143592 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:35.143562 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-698cb67655-bhcxl" Apr 16 19:19:38.190704 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.190636 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" podUID="f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" containerName="registry" containerID="cri-o://9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe" gracePeriod=30 Apr 16 19:19:38.218662 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.218634 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5774b77849-6p65s" Apr 16 19:19:38.434284 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.434251 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:19:38.566958 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.566920 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-trusted-ca\") pod \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " Apr 16 19:19:38.567229 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.567205 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") pod \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " Apr 16 19:19:38.567357 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.567341 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-installation-pull-secrets\") pod \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " Apr 16 19:19:38.567512 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.567497 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-certificates\") pod \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " Apr 16 19:19:38.567621 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.567607 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lcbd\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-kube-api-access-8lcbd\") pod \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " Apr 16 
19:19:38.567730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.567715 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-image-registry-private-configuration\") pod \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " Apr 16 19:19:38.567846 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.567832 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-ca-trust-extracted\") pod \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " Apr 16 19:19:38.568173 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.567372 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:19:38.568173 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.567971 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-bound-sa-token\") pod \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\" (UID: \"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2\") " Apr 16 19:19:38.568311 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.568200 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-trusted-ca\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:19:38.568554 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.568523 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:19:38.570797 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.570754 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:19:38.571290 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.571204 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:19:38.571290 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.571209 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-kube-api-access-8lcbd" (OuterVolumeSpecName: "kube-api-access-8lcbd") pod "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2"). InnerVolumeSpecName "kube-api-access-8lcbd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:19:38.571466 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.571314 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:19:38.571524 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.571491 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:19:38.587550 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.587500 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" (UID: "f99d9c18-6acb-4dcc-a1e9-afadc152c9e2"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:19:38.669371 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.669330 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-installation-pull-secrets\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:19:38.669371 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.669370 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-certificates\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:19:38.669719 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.669385 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8lcbd\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-kube-api-access-8lcbd\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:19:38.669719 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.669396 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-image-registry-private-configuration\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:19:38.669719 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.669430 2578 reconciler_common.go:299] "Volume detached for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-ca-trust-extracted\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:19:38.669719 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.669470 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-bound-sa-token\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:19:38.669719 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.669481 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2-registry-tls\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:19:38.790644 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.790609 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-587db8f75f-znrfz"] Apr 16 19:19:38.791018 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.790997 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" containerName="registry" Apr 16 19:19:38.791142 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.791130 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" containerName="registry" Apr 16 19:19:38.791294 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.791284 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" containerName="registry" Apr 16 19:19:38.796475 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.796444 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.800162 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.799472 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 19:19:38.800162 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.799883 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 19:19:38.800162 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.800113 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 19:19:38.800510 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.800447 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 19:19:38.800510 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.800468 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 19:19:38.800510 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.800468 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 19:19:38.800655 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.800520 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 19:19:38.800655 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.800467 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7km65\"" Apr 16 19:19:38.806256 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.806223 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-587db8f75f-znrfz"] Apr 16 19:19:38.871885 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:19:38.871798 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-service-ca\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.871885 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.871866 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnz7j\" (UniqueName: \"kubernetes.io/projected/9a772c6a-c464-4345-bb91-bd201e07c1be-kube-api-access-hnz7j\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.872055 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.871916 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-oauth-serving-cert\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.872055 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.872005 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-serving-cert\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.872055 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.872039 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-oauth-config\") pod 
\"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.872149 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.872079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-console-config\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.973335 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.973287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-serving-cert\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.973335 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.973331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-oauth-config\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.973630 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.973357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-console-config\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.973630 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.973385 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-service-ca\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.973630 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.973455 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnz7j\" (UniqueName: \"kubernetes.io/projected/9a772c6a-c464-4345-bb91-bd201e07c1be-kube-api-access-hnz7j\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.973630 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.973496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-oauth-serving-cert\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.974240 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.974209 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-console-config\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.974374 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.974250 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-oauth-serving-cert\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.974514 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.974492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-service-ca\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.976120 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.976099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-oauth-config\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.976462 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.976440 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-serving-cert\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:38.982058 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:38.982027 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnz7j\" (UniqueName: \"kubernetes.io/projected/9a772c6a-c464-4345-bb91-bd201e07c1be-kube-api-access-hnz7j\") pod \"console-587db8f75f-znrfz\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") " pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:39.109461 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.109380 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-587db8f75f-znrfz" Apr 16 19:19:39.231626 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.231587 2578 generic.go:358] "Generic (PLEG): container finished" podID="f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" containerID="9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe" exitCode=0 Apr 16 19:19:39.232101 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.231644 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" event={"ID":"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2","Type":"ContainerDied","Data":"9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe"} Apr 16 19:19:39.232101 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.231674 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" event={"ID":"f99d9c18-6acb-4dcc-a1e9-afadc152c9e2","Type":"ContainerDied","Data":"7f398e09ee1c3002d2ede05b0a4e6c72660138ab4087f4c9af45b4ce4d9cf2d3"} Apr 16 19:19:39.232101 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.231686 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-fd47c7899-27h7v" Apr 16 19:19:39.232101 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.231699 2578 scope.go:117] "RemoveContainer" containerID="9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe" Apr 16 19:19:39.239540 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.239510 2578 scope.go:117] "RemoveContainer" containerID="9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe" Apr 16 19:19:39.239867 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:19:39.239846 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe\": container with ID starting with 9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe not found: ID does not exist" containerID="9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe" Apr 16 19:19:39.239916 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.239876 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe"} err="failed to get container status \"9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe\": rpc error: code = NotFound desc = could not find container \"9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe\": container with ID starting with 9dc1691566ebe6371c05d3e3c39a93b391988e8aee069d87f579bb66375e37fe not found: ID does not exist" Apr 16 19:19:39.242955 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.242925 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-587db8f75f-znrfz"] Apr 16 19:19:39.249239 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:39.249205 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a772c6a_c464_4345_bb91_bd201e07c1be.slice/crio-8dd8b328efa4508b18d1bfdb9bfa3a2a4a9a90bba2bad3df4734293a7001851e WatchSource:0}: Error finding container 8dd8b328efa4508b18d1bfdb9bfa3a2a4a9a90bba2bad3df4734293a7001851e: Status 404 returned error can't find the container with id 8dd8b328efa4508b18d1bfdb9bfa3a2a4a9a90bba2bad3df4734293a7001851e Apr 16 19:19:39.251721 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.251695 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-fd47c7899-27h7v"] Apr 16 19:19:39.260704 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:39.260661 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-fd47c7899-27h7v"] Apr 16 19:19:40.236236 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:40.236195 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-587db8f75f-znrfz" event={"ID":"9a772c6a-c464-4345-bb91-bd201e07c1be","Type":"ContainerStarted","Data":"8dd8b328efa4508b18d1bfdb9bfa3a2a4a9a90bba2bad3df4734293a7001851e"} Apr 16 19:19:40.886878 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:40.886845 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f99d9c18-6acb-4dcc-a1e9-afadc152c9e2" path="/var/lib/kubelet/pods/f99d9c18-6acb-4dcc-a1e9-afadc152c9e2/volumes" Apr 16 19:19:43.246177 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:43.246085 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-587db8f75f-znrfz" event={"ID":"9a772c6a-c464-4345-bb91-bd201e07c1be","Type":"ContainerStarted","Data":"dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f"} Apr 16 19:19:43.270173 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:43.270118 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-587db8f75f-znrfz" podStartSLOduration=1.5231985780000001 
podStartE2EDuration="5.270103794s" podCreationTimestamp="2026-04-16 19:19:38 +0000 UTC" firstStartedPulling="2026-04-16 19:19:39.250740189 +0000 UTC m=+90.988643435" lastFinishedPulling="2026-04-16 19:19:42.997645402 +0000 UTC m=+94.735548651" observedRunningTime="2026-04-16 19:19:43.268087718 +0000 UTC m=+95.005990986" watchObservedRunningTime="2026-04-16 19:19:43.270103794 +0000 UTC m=+95.008007052" Apr 16 19:19:46.270318 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.270278 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74d6fb4844-p6gll"] Apr 16 19:19:46.273511 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.273490 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74d6fb4844-p6gll" Apr 16 19:19:46.282012 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.281988 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 19:19:46.288676 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.288635 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74d6fb4844-p6gll"] Apr 16 19:19:46.437581 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.437479 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-service-ca\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll" Apr 16 19:19:46.437581 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.437534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-serving-cert\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " 
pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.437581 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.437554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-oauth-config\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.437797 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.437659 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-console-config\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.437797 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.437691 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82jcw\" (UniqueName: \"kubernetes.io/projected/244b6cc7-4751-47e9-b5b8-f390117994d3-kube-api-access-82jcw\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.437797 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.437712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-oauth-serving-cert\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.437797 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.437745 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-trusted-ca-bundle\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.538840 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.538791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-console-config\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.538840 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.538836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-82jcw\" (UniqueName: \"kubernetes.io/projected/244b6cc7-4751-47e9-b5b8-f390117994d3-kube-api-access-82jcw\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.539067 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.538856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-oauth-serving-cert\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.539067 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.538874 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-trusted-ca-bundle\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.539067 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.538921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-service-ca\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.539067 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.538956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-serving-cert\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.539067 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.538981 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-oauth-config\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.539737 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.539660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-console-config\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.539737 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.539712 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-service-ca\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.540036 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.539821 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-trusted-ca-bundle\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.540388 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.540359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-oauth-serving-cert\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.541673 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.541643 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-oauth-config\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.541830 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.541808 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-serving-cert\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.553010 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.552984 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-82jcw\" (UniqueName: \"kubernetes.io/projected/244b6cc7-4751-47e9-b5b8-f390117994d3-kube-api-access-82jcw\") pod \"console-74d6fb4844-p6gll\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.584075 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.584032 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:46.735036 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:46.734954 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74d6fb4844-p6gll"]
Apr 16 19:19:46.740117 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:19:46.740087 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244b6cc7_4751_47e9_b5b8_f390117994d3.slice/crio-d94e6e8bf6e6cf7106223489b03a935414eb6c8676675793808b569dfdf394d4 WatchSource:0}: Error finding container d94e6e8bf6e6cf7106223489b03a935414eb6c8676675793808b569dfdf394d4: Status 404 returned error can't find the container with id d94e6e8bf6e6cf7106223489b03a935414eb6c8676675793808b569dfdf394d4
Apr 16 19:19:47.259657 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:47.259617 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d6fb4844-p6gll" event={"ID":"244b6cc7-4751-47e9-b5b8-f390117994d3","Type":"ContainerStarted","Data":"1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a"}
Apr 16 19:19:47.259657 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:47.259654 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d6fb4844-p6gll" event={"ID":"244b6cc7-4751-47e9-b5b8-f390117994d3","Type":"ContainerStarted","Data":"d94e6e8bf6e6cf7106223489b03a935414eb6c8676675793808b569dfdf394d4"}
Apr 16 19:19:47.291050 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:47.290987 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74d6fb4844-p6gll" podStartSLOduration=1.2909676829999999 podStartE2EDuration="1.290967683s" podCreationTimestamp="2026-04-16 19:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:19:47.288897977 +0000 UTC m=+99.026801247" watchObservedRunningTime="2026-04-16 19:19:47.290967683 +0000 UTC m=+99.028870952"
Apr 16 19:19:49.109992 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:49.109941 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-587db8f75f-znrfz"
Apr 16 19:19:49.110498 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:49.110005 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-587db8f75f-znrfz"
Apr 16 19:19:49.115001 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:49.114973 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-587db8f75f-znrfz"
Apr 16 19:19:49.269687 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:49.269658 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-587db8f75f-znrfz"
Apr 16 19:19:51.168563 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:51.168534 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-m5l2p"
Apr 16 19:19:56.584586 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:56.584538 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:56.584990 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:56.584601 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:56.589333 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:56.589306 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:57.291794 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:57.291766 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74d6fb4844-p6gll"
Apr 16 19:19:57.351029 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:19:57.350996 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-587db8f75f-znrfz"]
Apr 16 19:20:22.373018 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.372950 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-587db8f75f-znrfz" podUID="9a772c6a-c464-4345-bb91-bd201e07c1be" containerName="console" containerID="cri-o://dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f" gracePeriod=15
Apr 16 19:20:22.611272 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.611246 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-587db8f75f-znrfz_9a772c6a-c464-4345-bb91-bd201e07c1be/console/0.log"
Apr 16 19:20:22.611448 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.611311 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-587db8f75f-znrfz"
Apr 16 19:20:22.722673 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.722563 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-service-ca\") pod \"9a772c6a-c464-4345-bb91-bd201e07c1be\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") "
Apr 16 19:20:22.722673 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.722635 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnz7j\" (UniqueName: \"kubernetes.io/projected/9a772c6a-c464-4345-bb91-bd201e07c1be-kube-api-access-hnz7j\") pod \"9a772c6a-c464-4345-bb91-bd201e07c1be\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") "
Apr 16 19:20:22.722673 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.722662 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-console-config\") pod \"9a772c6a-c464-4345-bb91-bd201e07c1be\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") "
Apr 16 19:20:22.722949 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.722698 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-oauth-serving-cert\") pod \"9a772c6a-c464-4345-bb91-bd201e07c1be\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") "
Apr 16 19:20:22.722949 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.722730 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-serving-cert\") pod \"9a772c6a-c464-4345-bb91-bd201e07c1be\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") "
Apr 16 19:20:22.722949 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.722768 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-oauth-config\") pod \"9a772c6a-c464-4345-bb91-bd201e07c1be\" (UID: \"9a772c6a-c464-4345-bb91-bd201e07c1be\") "
Apr 16 19:20:22.723071 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.723041 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-service-ca" (OuterVolumeSpecName: "service-ca") pod "9a772c6a-c464-4345-bb91-bd201e07c1be" (UID: "9a772c6a-c464-4345-bb91-bd201e07c1be"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:20:22.723111 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.723046 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-console-config" (OuterVolumeSpecName: "console-config") pod "9a772c6a-c464-4345-bb91-bd201e07c1be" (UID: "9a772c6a-c464-4345-bb91-bd201e07c1be"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:20:22.723152 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.723106 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9a772c6a-c464-4345-bb91-bd201e07c1be" (UID: "9a772c6a-c464-4345-bb91-bd201e07c1be"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:20:22.725314 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.725275 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9a772c6a-c464-4345-bb91-bd201e07c1be" (UID: "9a772c6a-c464-4345-bb91-bd201e07c1be"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:20:22.725314 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.725303 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a772c6a-c464-4345-bb91-bd201e07c1be-kube-api-access-hnz7j" (OuterVolumeSpecName: "kube-api-access-hnz7j") pod "9a772c6a-c464-4345-bb91-bd201e07c1be" (UID: "9a772c6a-c464-4345-bb91-bd201e07c1be"). InnerVolumeSpecName "kube-api-access-hnz7j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:20:22.725501 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.725306 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9a772c6a-c464-4345-bb91-bd201e07c1be" (UID: "9a772c6a-c464-4345-bb91-bd201e07c1be"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:20:22.823939 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.823895 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-service-ca\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:20:22.823939 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.823933 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnz7j\" (UniqueName: \"kubernetes.io/projected/9a772c6a-c464-4345-bb91-bd201e07c1be-kube-api-access-hnz7j\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:20:22.823939 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.823945 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-console-config\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:20:22.824182 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.823955 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9a772c6a-c464-4345-bb91-bd201e07c1be-oauth-serving-cert\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:20:22.824182 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.823964 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-serving-cert\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:20:22.824182 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:22.823973 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9a772c6a-c464-4345-bb91-bd201e07c1be-console-oauth-config\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:20:23.361684 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.361643 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-587db8f75f-znrfz_9a772c6a-c464-4345-bb91-bd201e07c1be/console/0.log"
Apr 16 19:20:23.361870 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.361711 2578 generic.go:358] "Generic (PLEG): container finished" podID="9a772c6a-c464-4345-bb91-bd201e07c1be" containerID="dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f" exitCode=2
Apr 16 19:20:23.361870 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.361847 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-587db8f75f-znrfz" event={"ID":"9a772c6a-c464-4345-bb91-bd201e07c1be","Type":"ContainerDied","Data":"dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f"}
Apr 16 19:20:23.361987 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.361878 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-587db8f75f-znrfz" event={"ID":"9a772c6a-c464-4345-bb91-bd201e07c1be","Type":"ContainerDied","Data":"8dd8b328efa4508b18d1bfdb9bfa3a2a4a9a90bba2bad3df4734293a7001851e"}
Apr 16 19:20:23.361987 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.361899 2578 scope.go:117] "RemoveContainer" containerID="dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f"
Apr 16 19:20:23.363251 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.362115 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-587db8f75f-znrfz"
Apr 16 19:20:23.370523 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.370498 2578 scope.go:117] "RemoveContainer" containerID="dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f"
Apr 16 19:20:23.370859 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:20:23.370836 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f\": container with ID starting with dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f not found: ID does not exist" containerID="dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f"
Apr 16 19:20:23.370936 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.370867 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f"} err="failed to get container status \"dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f\": rpc error: code = NotFound desc = could not find container \"dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f\": container with ID starting with dbbf232eaf365253870025c3dc5133f0ae8e854a8e0a55f6edc1edd3113df95f not found: ID does not exist"
Apr 16 19:20:23.382232 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.382194 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-587db8f75f-znrfz"]
Apr 16 19:20:23.388907 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.388874 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-587db8f75f-znrfz"]
Apr 16 19:20:23.880696 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:23.880663 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s75jz_c95ee486-2760-4ad8-9188-14bfc1cb67df/init-textfile/0.log"
Apr 16 19:20:24.069516 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:24.069485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s75jz_c95ee486-2760-4ad8-9188-14bfc1cb67df/node-exporter/0.log"
Apr 16 19:20:24.268556 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:24.268527 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s75jz_c95ee486-2760-4ad8-9188-14bfc1cb67df/kube-rbac-proxy/0.log"
Apr 16 19:20:24.886383 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:24.886340 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a772c6a-c464-4345-bb91-bd201e07c1be" path="/var/lib/kubelet/pods/9a772c6a-c464-4345-bb91-bd201e07c1be/volumes"
Apr 16 19:20:27.069754 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:27.069722 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-h4qrx_bdf9751d-36a3-47a6-93d9-c268be17be2f/prometheus-operator/0.log"
Apr 16 19:20:27.269426 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:27.269378 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-h4qrx_bdf9751d-36a3-47a6-93d9-c268be17be2f/kube-rbac-proxy/0.log"
Apr 16 19:20:28.269200 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:28.269160 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/thanos-query/0.log"
Apr 16 19:20:28.468587 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:28.468559 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/kube-rbac-proxy-web/0.log"
Apr 16 19:20:28.669298 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:28.669214 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/kube-rbac-proxy/0.log"
Apr 16 19:20:28.869216 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:28.869170 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/prom-label-proxy/0.log"
Apr 16 19:20:29.068948 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:29.068917 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/kube-rbac-proxy-rules/0.log"
Apr 16 19:20:29.268651 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:29.268613 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/kube-rbac-proxy-metrics/0.log"
Apr 16 19:20:30.269640 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:30.269609 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74d6fb4844-p6gll_244b6cc7-4751-47e9-b5b8-f390117994d3/console/0.log"
Apr 16 19:20:31.268665 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:31.268637 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rdj9w_419f622c-e0bd-4976-87ce-7df0a1ed0500/serve-healthcheck-canary/0.log"
Apr 16 19:20:42.836684 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.836640 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76945448b-qj5z6"]
Apr 16 19:20:42.837223 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.837069 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a772c6a-c464-4345-bb91-bd201e07c1be" containerName="console"
Apr 16 19:20:42.837223 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.837088 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a772c6a-c464-4345-bb91-bd201e07c1be" containerName="console"
Apr 16 19:20:42.837223 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.837160 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a772c6a-c464-4345-bb91-bd201e07c1be" containerName="console"
Apr 16 19:20:42.840261 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.840236 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:42.851993 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.851960 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76945448b-qj5z6"]
Apr 16 19:20:42.976774 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.976734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-oauth-config\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:42.976774 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.976775 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-oauth-serving-cert\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:42.976774 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.976793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-serving-cert\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:42.977056 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.976811 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4s6c\" (UniqueName: \"kubernetes.io/projected/b47bb31b-ddcf-4cbc-928b-38a6541f8379-kube-api-access-q4s6c\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:42.977056 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.976913 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-trusted-ca-bundle\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:42.977056 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.976976 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-config\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:42.977056 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:42.977008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-service-ca\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.077822 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.077786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-oauth-config\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.077822 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.077826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-oauth-serving-cert\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.078083 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.077843 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-serving-cert\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.078083 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.077862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4s6c\" (UniqueName: \"kubernetes.io/projected/b47bb31b-ddcf-4cbc-928b-38a6541f8379-kube-api-access-q4s6c\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.078083 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.077897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-trusted-ca-bundle\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.078083 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.077932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-config\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.078083 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.077962 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-service-ca\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.078733 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.078701 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-oauth-serving-cert\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.078858 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.078743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-service-ca\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.078858 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.078710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-config\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.078964 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.078943 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-trusted-ca-bundle\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.080594 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.080563 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-serving-cert\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.080777 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.080656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-oauth-config\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.086053 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.086030 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4s6c\" (UniqueName: \"kubernetes.io/projected/b47bb31b-ddcf-4cbc-928b-38a6541f8379-kube-api-access-q4s6c\") pod \"console-76945448b-qj5z6\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.151327 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.151244 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:43.282517 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.282477 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76945448b-qj5z6"]
Apr 16 19:20:43.284776 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:20:43.284743 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb47bb31b_ddcf_4cbc_928b_38a6541f8379.slice/crio-6e96de6129ef26e071813fbbf62ab874712167d711312df4c0de14f5ef309f97 WatchSource:0}: Error finding container 6e96de6129ef26e071813fbbf62ab874712167d711312df4c0de14f5ef309f97: Status 404 returned error can't find the container with id 6e96de6129ef26e071813fbbf62ab874712167d711312df4c0de14f5ef309f97
Apr 16 19:20:43.425385 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.425288 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76945448b-qj5z6" event={"ID":"b47bb31b-ddcf-4cbc-928b-38a6541f8379","Type":"ContainerStarted","Data":"8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f"}
Apr 16 19:20:43.425385 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.425325 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76945448b-qj5z6" event={"ID":"b47bb31b-ddcf-4cbc-928b-38a6541f8379","Type":"ContainerStarted","Data":"6e96de6129ef26e071813fbbf62ab874712167d711312df4c0de14f5ef309f97"}
Apr 16 19:20:43.448283 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:43.448232 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76945448b-qj5z6" podStartSLOduration=1.4482141290000001 podStartE2EDuration="1.448214129s" podCreationTimestamp="2026-04-16 19:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:20:43.446985752 +0000 UTC m=+155.184889021" watchObservedRunningTime="2026-04-16 19:20:43.448214129 +0000 UTC m=+155.186117397"
Apr 16 19:20:53.152435 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:53.152323 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:53.152435 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:53.152382 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:53.157194 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:53.157166 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:53.458896 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:53.458818 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76945448b-qj5z6"
Apr 16 19:20:53.517213 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:20:53.517179 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74d6fb4844-p6gll"]
Apr 16 19:21:18.541637 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.541571 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74d6fb4844-p6gll" podUID="244b6cc7-4751-47e9-b5b8-f390117994d3" containerName="console" containerID="cri-o://1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a" gracePeriod=15
Apr 16 19:21:18.773918 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.773892 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74d6fb4844-p6gll_244b6cc7-4751-47e9-b5b8-f390117994d3/console/0.log"
Apr 16 19:21:18.774068 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.773958 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d6fb4844-p6gll" Apr 16 19:21:18.962694 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.962596 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-oauth-serving-cert\") pod \"244b6cc7-4751-47e9-b5b8-f390117994d3\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " Apr 16 19:21:18.962694 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.962644 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-oauth-config\") pod \"244b6cc7-4751-47e9-b5b8-f390117994d3\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " Apr 16 19:21:18.962694 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.962667 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-service-ca\") pod \"244b6cc7-4751-47e9-b5b8-f390117994d3\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " Apr 16 19:21:18.962694 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.962692 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-console-config\") pod \"244b6cc7-4751-47e9-b5b8-f390117994d3\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " Apr 16 19:21:18.963029 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.962753 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-serving-cert\") pod \"244b6cc7-4751-47e9-b5b8-f390117994d3\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " Apr 16 19:21:18.963029 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:21:18.962780 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82jcw\" (UniqueName: \"kubernetes.io/projected/244b6cc7-4751-47e9-b5b8-f390117994d3-kube-api-access-82jcw\") pod \"244b6cc7-4751-47e9-b5b8-f390117994d3\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " Apr 16 19:21:18.963029 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.962856 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-trusted-ca-bundle\") pod \"244b6cc7-4751-47e9-b5b8-f390117994d3\" (UID: \"244b6cc7-4751-47e9-b5b8-f390117994d3\") " Apr 16 19:21:18.963199 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.963159 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "244b6cc7-4751-47e9-b5b8-f390117994d3" (UID: "244b6cc7-4751-47e9-b5b8-f390117994d3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:21:18.963199 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.963176 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-service-ca" (OuterVolumeSpecName: "service-ca") pod "244b6cc7-4751-47e9-b5b8-f390117994d3" (UID: "244b6cc7-4751-47e9-b5b8-f390117994d3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:21:18.963351 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.963322 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "244b6cc7-4751-47e9-b5b8-f390117994d3" (UID: "244b6cc7-4751-47e9-b5b8-f390117994d3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:21:18.963459 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.963400 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-console-config" (OuterVolumeSpecName: "console-config") pod "244b6cc7-4751-47e9-b5b8-f390117994d3" (UID: "244b6cc7-4751-47e9-b5b8-f390117994d3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:21:18.965220 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.965193 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244b6cc7-4751-47e9-b5b8-f390117994d3-kube-api-access-82jcw" (OuterVolumeSpecName: "kube-api-access-82jcw") pod "244b6cc7-4751-47e9-b5b8-f390117994d3" (UID: "244b6cc7-4751-47e9-b5b8-f390117994d3"). InnerVolumeSpecName "kube-api-access-82jcw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:21:18.965220 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.965204 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "244b6cc7-4751-47e9-b5b8-f390117994d3" (UID: "244b6cc7-4751-47e9-b5b8-f390117994d3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:21:18.965355 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:18.965222 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "244b6cc7-4751-47e9-b5b8-f390117994d3" (UID: "244b6cc7-4751-47e9-b5b8-f390117994d3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:21:19.063813 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.063762 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-trusted-ca-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:21:19.063813 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.063805 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-oauth-serving-cert\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:21:19.063813 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.063815 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-oauth-config\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:21:19.063813 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.063825 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-service-ca\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:21:19.063813 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.063835 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/244b6cc7-4751-47e9-b5b8-f390117994d3-console-config\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:21:19.064120 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.063843 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/244b6cc7-4751-47e9-b5b8-f390117994d3-console-serving-cert\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:21:19.064120 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.063852 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-82jcw\" (UniqueName: \"kubernetes.io/projected/244b6cc7-4751-47e9-b5b8-f390117994d3-kube-api-access-82jcw\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:21:19.527015 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.526986 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74d6fb4844-p6gll_244b6cc7-4751-47e9-b5b8-f390117994d3/console/0.log" Apr 16 19:21:19.527193 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.527030 2578 generic.go:358] "Generic (PLEG): container finished" podID="244b6cc7-4751-47e9-b5b8-f390117994d3" containerID="1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a" exitCode=2 Apr 16 19:21:19.527193 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.527067 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d6fb4844-p6gll" event={"ID":"244b6cc7-4751-47e9-b5b8-f390117994d3","Type":"ContainerDied","Data":"1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a"} Apr 16 19:21:19.527193 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.527102 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74d6fb4844-p6gll" Apr 16 19:21:19.527193 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.527112 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74d6fb4844-p6gll" event={"ID":"244b6cc7-4751-47e9-b5b8-f390117994d3","Type":"ContainerDied","Data":"d94e6e8bf6e6cf7106223489b03a935414eb6c8676675793808b569dfdf394d4"} Apr 16 19:21:19.527193 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.527129 2578 scope.go:117] "RemoveContainer" containerID="1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a" Apr 16 19:21:19.535361 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.535342 2578 scope.go:117] "RemoveContainer" containerID="1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a" Apr 16 19:21:19.535683 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:21:19.535664 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a\": container with ID starting with 1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a not found: ID does not exist" containerID="1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a" Apr 16 19:21:19.535730 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.535693 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a"} err="failed to get container status \"1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a\": rpc error: code = NotFound desc = could not find container \"1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a\": container with ID starting with 1ed0ec454f02db67faf9ddd990708e83fcd4b709f3af5c2185c5ca02384c556a not found: ID does not exist" Apr 16 19:21:19.551231 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.551190 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74d6fb4844-p6gll"] Apr 16 19:21:19.557392 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:19.557361 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74d6fb4844-p6gll"] Apr 16 19:21:20.886543 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:21:20.886502 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244b6cc7-4751-47e9-b5b8-f390117994d3" path="/var/lib/kubelet/pods/244b6cc7-4751-47e9-b5b8-f390117994d3/volumes" Apr 16 19:23:08.766346 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:23:08.766314 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:23:08.766885 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:23:08.766556 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:23:08.773972 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:23:08.773948 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 19:24:42.178637 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.178597 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c7f7f95f9-sn7pt"] Apr 16 19:24:42.179094 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.178918 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="244b6cc7-4751-47e9-b5b8-f390117994d3" containerName="console" Apr 16 19:24:42.179094 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.178932 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="244b6cc7-4751-47e9-b5b8-f390117994d3" containerName="console" Apr 16 19:24:42.179094 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.178993 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="244b6cc7-4751-47e9-b5b8-f390117994d3" containerName="console" Apr 16 19:24:42.180701 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.180682 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.206025 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.205991 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c7f7f95f9-sn7pt"] Apr 16 19:24:42.242024 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.241982 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b101b29-cfd8-4f18-9019-0845925a7e2d-console-oauth-config\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.242024 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.242022 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-service-ca\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.242237 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.242043 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b101b29-cfd8-4f18-9019-0845925a7e2d-console-serving-cert\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.242237 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.242126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-oauth-serving-cert\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.242237 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.242160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-trusted-ca-bundle\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.242237 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.242179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-console-config\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.242372 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.242249 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2sbf\" (UniqueName: \"kubernetes.io/projected/5b101b29-cfd8-4f18-9019-0845925a7e2d-kube-api-access-v2sbf\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.343268 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.343238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b101b29-cfd8-4f18-9019-0845925a7e2d-console-oauth-config\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.343268 ip-10-0-130-163 kubenswrapper[2578]: 
I0416 19:24:42.343274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-service-ca\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.343482 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.343291 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b101b29-cfd8-4f18-9019-0845925a7e2d-console-serving-cert\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.343482 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.343326 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-oauth-serving-cert\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.343482 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.343355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-trusted-ca-bundle\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.343482 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.343381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-console-config\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 
16 19:24:42.343482 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.343440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2sbf\" (UniqueName: \"kubernetes.io/projected/5b101b29-cfd8-4f18-9019-0845925a7e2d-kube-api-access-v2sbf\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.344125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.344099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-service-ca\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.344254 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.344224 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-oauth-serving-cert\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.344324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.344305 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-trusted-ca-bundle\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.344362 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.344315 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b101b29-cfd8-4f18-9019-0845925a7e2d-console-config\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " 
pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.345876 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.345851 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b101b29-cfd8-4f18-9019-0845925a7e2d-console-oauth-config\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.346026 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.346008 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b101b29-cfd8-4f18-9019-0845925a7e2d-console-serving-cert\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.352221 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.352200 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2sbf\" (UniqueName: \"kubernetes.io/projected/5b101b29-cfd8-4f18-9019-0845925a7e2d-kube-api-access-v2sbf\") pod \"console-7c7f7f95f9-sn7pt\" (UID: \"5b101b29-cfd8-4f18-9019-0845925a7e2d\") " pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.494643 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.494609 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:42.637979 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.637943 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c7f7f95f9-sn7pt"] Apr 16 19:24:42.641247 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:24:42.641216 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b101b29_cfd8_4f18_9019_0845925a7e2d.slice/crio-fdcf2318161c7e4823c0a85d333e812792287af91c9aef30486c342d2a56a3a2 WatchSource:0}: Error finding container fdcf2318161c7e4823c0a85d333e812792287af91c9aef30486c342d2a56a3a2: Status 404 returned error can't find the container with id fdcf2318161c7e4823c0a85d333e812792287af91c9aef30486c342d2a56a3a2 Apr 16 19:24:42.643540 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:42.643521 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:24:43.073801 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:43.073767 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7f7f95f9-sn7pt" event={"ID":"5b101b29-cfd8-4f18-9019-0845925a7e2d","Type":"ContainerStarted","Data":"bb0f581759a97b866856681099642f53f6b81dafc916d6644deaf0185db261a3"} Apr 16 19:24:43.073801 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:43.073803 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c7f7f95f9-sn7pt" event={"ID":"5b101b29-cfd8-4f18-9019-0845925a7e2d","Type":"ContainerStarted","Data":"fdcf2318161c7e4823c0a85d333e812792287af91c9aef30486c342d2a56a3a2"} Apr 16 19:24:43.092759 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:43.092701 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c7f7f95f9-sn7pt" podStartSLOduration=1.092679235 podStartE2EDuration="1.092679235s" podCreationTimestamp="2026-04-16 19:24:42 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:24:43.091563087 +0000 UTC m=+394.829466355" watchObservedRunningTime="2026-04-16 19:24:43.092679235 +0000 UTC m=+394.830582504" Apr 16 19:24:52.495343 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:52.495304 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:52.495718 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:52.495364 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:52.500325 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:52.500299 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:53.105482 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:53.105453 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c7f7f95f9-sn7pt" Apr 16 19:24:53.157011 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:24:53.156978 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76945448b-qj5z6"] Apr 16 19:25:18.176501 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.176373 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76945448b-qj5z6" podUID="b47bb31b-ddcf-4cbc-928b-38a6541f8379" containerName="console" containerID="cri-o://8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f" gracePeriod=15 Apr 16 19:25:18.413174 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.413147 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76945448b-qj5z6_b47bb31b-ddcf-4cbc-928b-38a6541f8379/console/0.log" Apr 16 19:25:18.413384 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.413224 2578 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76945448b-qj5z6" Apr 16 19:25:18.521471 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.521429 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-service-ca\") pod \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " Apr 16 19:25:18.521648 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.521508 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-serving-cert\") pod \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " Apr 16 19:25:18.521648 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.521572 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-config\") pod \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " Apr 16 19:25:18.521648 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.521604 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-trusted-ca-bundle\") pod \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " Apr 16 19:25:18.521776 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.521651 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-oauth-serving-cert\") pod \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " Apr 16 
19:25:18.521776 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.521689 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4s6c\" (UniqueName: \"kubernetes.io/projected/b47bb31b-ddcf-4cbc-928b-38a6541f8379-kube-api-access-q4s6c\") pod \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " Apr 16 19:25:18.521776 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.521728 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-oauth-config\") pod \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\" (UID: \"b47bb31b-ddcf-4cbc-928b-38a6541f8379\") " Apr 16 19:25:18.521930 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.521833 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-service-ca" (OuterVolumeSpecName: "service-ca") pod "b47bb31b-ddcf-4cbc-928b-38a6541f8379" (UID: "b47bb31b-ddcf-4cbc-928b-38a6541f8379"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:25:18.522078 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.522048 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-config" (OuterVolumeSpecName: "console-config") pod "b47bb31b-ddcf-4cbc-928b-38a6541f8379" (UID: "b47bb31b-ddcf-4cbc-928b-38a6541f8379"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:25:18.522078 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.522058 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b47bb31b-ddcf-4cbc-928b-38a6541f8379" (UID: "b47bb31b-ddcf-4cbc-928b-38a6541f8379"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:25:18.522213 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.522078 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-service-ca\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:25:18.522213 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.522061 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b47bb31b-ddcf-4cbc-928b-38a6541f8379" (UID: "b47bb31b-ddcf-4cbc-928b-38a6541f8379"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:25:18.523906 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.523878 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b47bb31b-ddcf-4cbc-928b-38a6541f8379" (UID: "b47bb31b-ddcf-4cbc-928b-38a6541f8379"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:25:18.523906 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.523892 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b47bb31b-ddcf-4cbc-928b-38a6541f8379" (UID: "b47bb31b-ddcf-4cbc-928b-38a6541f8379"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:25:18.524069 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.523924 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b47bb31b-ddcf-4cbc-928b-38a6541f8379-kube-api-access-q4s6c" (OuterVolumeSpecName: "kube-api-access-q4s6c") pod "b47bb31b-ddcf-4cbc-928b-38a6541f8379" (UID: "b47bb31b-ddcf-4cbc-928b-38a6541f8379"). InnerVolumeSpecName "kube-api-access-q4s6c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:25:18.623135 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.623082 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4s6c\" (UniqueName: \"kubernetes.io/projected/b47bb31b-ddcf-4cbc-928b-38a6541f8379-kube-api-access-q4s6c\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:25:18.623135 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.623126 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-oauth-config\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:25:18.623135 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.623137 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-serving-cert\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:25:18.623135 
ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.623147 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-console-config\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:25:18.623135 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.623158 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-trusted-ca-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:25:18.623475 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:18.623167 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b47bb31b-ddcf-4cbc-928b-38a6541f8379-oauth-serving-cert\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:25:19.176099 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.176017 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76945448b-qj5z6_b47bb31b-ddcf-4cbc-928b-38a6541f8379/console/0.log" Apr 16 19:25:19.176099 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.176058 2578 generic.go:358] "Generic (PLEG): container finished" podID="b47bb31b-ddcf-4cbc-928b-38a6541f8379" containerID="8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f" exitCode=2 Apr 16 19:25:19.176304 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.176094 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76945448b-qj5z6" event={"ID":"b47bb31b-ddcf-4cbc-928b-38a6541f8379","Type":"ContainerDied","Data":"8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f"} Apr 16 19:25:19.176304 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.176141 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76945448b-qj5z6" Apr 16 19:25:19.176304 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.176157 2578 scope.go:117] "RemoveContainer" containerID="8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f" Apr 16 19:25:19.176304 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.176143 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76945448b-qj5z6" event={"ID":"b47bb31b-ddcf-4cbc-928b-38a6541f8379","Type":"ContainerDied","Data":"6e96de6129ef26e071813fbbf62ab874712167d711312df4c0de14f5ef309f97"} Apr 16 19:25:19.184120 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.184101 2578 scope.go:117] "RemoveContainer" containerID="8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f" Apr 16 19:25:19.184447 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:25:19.184400 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f\": container with ID starting with 8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f not found: ID does not exist" containerID="8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f" Apr 16 19:25:19.184498 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.184452 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f"} err="failed to get container status \"8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f\": rpc error: code = NotFound desc = could not find container \"8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f\": container with ID starting with 8ce264e3191844587fd92ae4513c9d15bf26d4f22230f78380ab450414be573f not found: ID does not exist" Apr 16 19:25:19.196665 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.196635 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76945448b-qj5z6"] Apr 16 19:25:19.200867 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:19.200843 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76945448b-qj5z6"] Apr 16 19:25:20.886509 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:25:20.886474 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b47bb31b-ddcf-4cbc-928b-38a6541f8379" path="/var/lib/kubelet/pods/b47bb31b-ddcf-4cbc-928b-38a6541f8379/volumes" Apr 16 19:26:32.654822 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.654783 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg"] Apr 16 19:26:32.655324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.655074 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b47bb31b-ddcf-4cbc-928b-38a6541f8379" containerName="console" Apr 16 19:26:32.655324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.655085 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47bb31b-ddcf-4cbc-928b-38a6541f8379" containerName="console" Apr 16 19:26:32.655324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.655167 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b47bb31b-ddcf-4cbc-928b-38a6541f8379" containerName="console" Apr 16 19:26:32.658271 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.658248 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.660856 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.660836 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:26:32.660934 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.660871 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:26:32.661639 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.661624 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ddqj5\"" Apr 16 19:26:32.665953 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.665929 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg"] Apr 16 19:26:32.784379 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.784332 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb5xx\" (UniqueName: \"kubernetes.io/projected/c392d396-996e-4cd8-b457-8f482c6030c3-kube-api-access-gb5xx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.784379 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.784381 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.784603 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.784466 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.884854 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.884819 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gb5xx\" (UniqueName: \"kubernetes.io/projected/c392d396-996e-4cd8-b457-8f482c6030c3-kube-api-access-gb5xx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.884854 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.884855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.885097 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.884885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.885280 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.885262 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.885353 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.885315 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.894084 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.894056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb5xx\" (UniqueName: \"kubernetes.io/projected/c392d396-996e-4cd8-b457-8f482c6030c3-kube-api-access-gb5xx\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:32.968149 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:32.968059 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:33.093685 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:33.093654 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg"] Apr 16 19:26:33.096823 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:26:33.096788 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc392d396_996e_4cd8_b457_8f482c6030c3.slice/crio-4fdf51e16f077326b34ded8f3b502e9db927ea655efbbb4337110d2f694218ed WatchSource:0}: Error finding container 4fdf51e16f077326b34ded8f3b502e9db927ea655efbbb4337110d2f694218ed: Status 404 returned error can't find the container with id 4fdf51e16f077326b34ded8f3b502e9db927ea655efbbb4337110d2f694218ed Apr 16 19:26:33.383115 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:33.383070 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" event={"ID":"c392d396-996e-4cd8-b457-8f482c6030c3","Type":"ContainerStarted","Data":"4fdf51e16f077326b34ded8f3b502e9db927ea655efbbb4337110d2f694218ed"} Apr 16 19:26:38.399043 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:38.399004 2578 generic.go:358] "Generic (PLEG): container finished" podID="c392d396-996e-4cd8-b457-8f482c6030c3" containerID="7422ee805f9fc523a1a179c13c7bf24e9b2d7592756b931baca7928212f4aff2" exitCode=0 Apr 16 19:26:38.399451 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:38.399049 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" event={"ID":"c392d396-996e-4cd8-b457-8f482c6030c3","Type":"ContainerDied","Data":"7422ee805f9fc523a1a179c13c7bf24e9b2d7592756b931baca7928212f4aff2"} Apr 16 19:26:40.411034 ip-10-0-130-163 kubenswrapper[2578]: 
I0416 19:26:40.411002 2578 generic.go:358] "Generic (PLEG): container finished" podID="c392d396-996e-4cd8-b457-8f482c6030c3" containerID="24fee4bd4d712405444e46a4087d7d4f05915e93dd6c0715b7356a10c281f520" exitCode=0 Apr 16 19:26:40.411431 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:40.411080 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" event={"ID":"c392d396-996e-4cd8-b457-8f482c6030c3","Type":"ContainerDied","Data":"24fee4bd4d712405444e46a4087d7d4f05915e93dd6c0715b7356a10c281f520"} Apr 16 19:26:49.438826 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:49.438722 2578 generic.go:358] "Generic (PLEG): container finished" podID="c392d396-996e-4cd8-b457-8f482c6030c3" containerID="fd7dc4c91bd91f082f95fab9824f9baf628a9e6fae5e94bdc9fe4f9646277a6a" exitCode=0 Apr 16 19:26:49.438826 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:49.438811 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" event={"ID":"c392d396-996e-4cd8-b457-8f482c6030c3","Type":"ContainerDied","Data":"fd7dc4c91bd91f082f95fab9824f9baf628a9e6fae5e94bdc9fe4f9646277a6a"} Apr 16 19:26:50.560155 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.560130 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:50.628895 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.628860 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-util\") pod \"c392d396-996e-4cd8-b457-8f482c6030c3\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " Apr 16 19:26:50.629123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.628951 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb5xx\" (UniqueName: \"kubernetes.io/projected/c392d396-996e-4cd8-b457-8f482c6030c3-kube-api-access-gb5xx\") pod \"c392d396-996e-4cd8-b457-8f482c6030c3\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " Apr 16 19:26:50.629123 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.629050 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-bundle\") pod \"c392d396-996e-4cd8-b457-8f482c6030c3\" (UID: \"c392d396-996e-4cd8-b457-8f482c6030c3\") " Apr 16 19:26:50.629659 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.629626 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-bundle" (OuterVolumeSpecName: "bundle") pod "c392d396-996e-4cd8-b457-8f482c6030c3" (UID: "c392d396-996e-4cd8-b457-8f482c6030c3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:26:50.631436 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.631394 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c392d396-996e-4cd8-b457-8f482c6030c3-kube-api-access-gb5xx" (OuterVolumeSpecName: "kube-api-access-gb5xx") pod "c392d396-996e-4cd8-b457-8f482c6030c3" (UID: "c392d396-996e-4cd8-b457-8f482c6030c3"). InnerVolumeSpecName "kube-api-access-gb5xx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:26:50.634053 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.634025 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-util" (OuterVolumeSpecName: "util") pod "c392d396-996e-4cd8-b457-8f482c6030c3" (UID: "c392d396-996e-4cd8-b457-8f482c6030c3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:26:50.730361 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.730267 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:26:50.730361 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.730302 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c392d396-996e-4cd8-b457-8f482c6030c3-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:26:50.730361 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:50.730312 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gb5xx\" (UniqueName: \"kubernetes.io/projected/c392d396-996e-4cd8-b457-8f482c6030c3-kube-api-access-gb5xx\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:26:51.445523 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:51.445484 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" event={"ID":"c392d396-996e-4cd8-b457-8f482c6030c3","Type":"ContainerDied","Data":"4fdf51e16f077326b34ded8f3b502e9db927ea655efbbb4337110d2f694218ed"} Apr 16 19:26:51.445523 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:51.445526 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fdf51e16f077326b34ded8f3b502e9db927ea655efbbb4337110d2f694218ed" Apr 16 19:26:51.445769 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:51.445499 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5m9pcg" Apr 16 19:26:55.236736 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.236699 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz"] Apr 16 19:26:55.237126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.236969 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c392d396-996e-4cd8-b457-8f482c6030c3" containerName="util" Apr 16 19:26:55.237126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.236982 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c392d396-996e-4cd8-b457-8f482c6030c3" containerName="util" Apr 16 19:26:55.237126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.236992 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c392d396-996e-4cd8-b457-8f482c6030c3" containerName="pull" Apr 16 19:26:55.237126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.237000 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c392d396-996e-4cd8-b457-8f482c6030c3" containerName="pull" Apr 16 19:26:55.237126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.237031 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c392d396-996e-4cd8-b457-8f482c6030c3" containerName="extract" Apr 16 19:26:55.237126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.237041 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c392d396-996e-4cd8-b457-8f482c6030c3" containerName="extract" Apr 16 19:26:55.237126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.237088 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c392d396-996e-4cd8-b457-8f482c6030c3" containerName="extract" Apr 16 19:26:55.242545 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.242524 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" Apr 16 19:26:55.245348 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.245327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 19:26:55.245478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.245460 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:26:55.245524 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.245493 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-n4qp7\"" Apr 16 19:26:55.260099 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.260070 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz"] Apr 16 19:26:55.260378 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.260350 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nljdw\" (UniqueName: \"kubernetes.io/projected/2d655b8a-8ed2-40a3-ad2f-6c6011b59869-kube-api-access-nljdw\") pod 
\"cert-manager-operator-controller-manager-7ccfb878b5-c98dz\" (UID: \"2d655b8a-8ed2-40a3-ad2f-6c6011b59869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" Apr 16 19:26:55.260488 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.260397 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d655b8a-8ed2-40a3-ad2f-6c6011b59869-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-c98dz\" (UID: \"2d655b8a-8ed2-40a3-ad2f-6c6011b59869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" Apr 16 19:26:55.361348 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.361310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d655b8a-8ed2-40a3-ad2f-6c6011b59869-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-c98dz\" (UID: \"2d655b8a-8ed2-40a3-ad2f-6c6011b59869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" Apr 16 19:26:55.361559 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.361389 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nljdw\" (UniqueName: \"kubernetes.io/projected/2d655b8a-8ed2-40a3-ad2f-6c6011b59869-kube-api-access-nljdw\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-c98dz\" (UID: \"2d655b8a-8ed2-40a3-ad2f-6c6011b59869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" Apr 16 19:26:55.361737 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.361716 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2d655b8a-8ed2-40a3-ad2f-6c6011b59869-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-c98dz\" (UID: \"2d655b8a-8ed2-40a3-ad2f-6c6011b59869\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" Apr 16 19:26:55.370853 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.370816 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nljdw\" (UniqueName: \"kubernetes.io/projected/2d655b8a-8ed2-40a3-ad2f-6c6011b59869-kube-api-access-nljdw\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-c98dz\" (UID: \"2d655b8a-8ed2-40a3-ad2f-6c6011b59869\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" Apr 16 19:26:55.551194 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.551166 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" Apr 16 19:26:55.679850 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:55.679812 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz"] Apr 16 19:26:55.684444 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:26:55.684383 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d655b8a_8ed2_40a3_ad2f_6c6011b59869.slice/crio-f184bc68e9a1ad6781886ba71a59a919905f5b902b5be68d3262d355f6ec4126 WatchSource:0}: Error finding container f184bc68e9a1ad6781886ba71a59a919905f5b902b5be68d3262d355f6ec4126: Status 404 returned error can't find the container with id f184bc68e9a1ad6781886ba71a59a919905f5b902b5be68d3262d355f6ec4126 Apr 16 19:26:56.460559 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:56.460521 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" event={"ID":"2d655b8a-8ed2-40a3-ad2f-6c6011b59869","Type":"ContainerStarted","Data":"f184bc68e9a1ad6781886ba71a59a919905f5b902b5be68d3262d355f6ec4126"} Apr 16 19:26:58.470323 
ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:58.470284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" event={"ID":"2d655b8a-8ed2-40a3-ad2f-6c6011b59869","Type":"ContainerStarted","Data":"f971f991bbf2a37efacb76c19f3303a8ad3e102bfe57b245ea9c2aecbe044a4e"} Apr 16 19:26:58.495723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:58.495668 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-c98dz" podStartSLOduration=1.397229719 podStartE2EDuration="3.495654335s" podCreationTimestamp="2026-04-16 19:26:55 +0000 UTC" firstStartedPulling="2026-04-16 19:26:55.686951912 +0000 UTC m=+527.424855172" lastFinishedPulling="2026-04-16 19:26:57.785376539 +0000 UTC m=+529.523279788" observedRunningTime="2026-04-16 19:26:58.494109176 +0000 UTC m=+530.232012446" watchObservedRunningTime="2026-04-16 19:26:58.495654335 +0000 UTC m=+530.233557602" Apr 16 19:26:59.046251 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.046215 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn"] Apr 16 19:26:59.049631 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.049610 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.055299 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.055274 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:26:59.056235 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.055910 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ddqj5\"" Apr 16 19:26:59.056311 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.056277 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:26:59.063262 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.063235 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn"] Apr 16 19:26:59.088549 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.088514 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m4rj\" (UniqueName: \"kubernetes.io/projected/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-kube-api-access-9m4rj\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.088721 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.088564 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.088721 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.088612 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.189195 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.189152 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9m4rj\" (UniqueName: \"kubernetes.io/projected/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-kube-api-access-9m4rj\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.189195 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.189207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.189440 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.189236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.189693 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.189675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.189729 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.189693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.208025 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.207986 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m4rj\" (UniqueName: \"kubernetes.io/projected/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-kube-api-access-9m4rj\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.358878 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.358770 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:26:59.489094 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:26:59.489062 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn"] Apr 16 19:26:59.495338 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:26:59.495303 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a1d0ef8_3c95_4bcb_a5fc_65d3ec1ee725.slice/crio-d0b517e7d6d543ce2f7423aa2380a76ae74ce8e22ae3c9ba9f6b023183784bb0 WatchSource:0}: Error finding container d0b517e7d6d543ce2f7423aa2380a76ae74ce8e22ae3c9ba9f6b023183784bb0: Status 404 returned error can't find the container with id d0b517e7d6d543ce2f7423aa2380a76ae74ce8e22ae3c9ba9f6b023183784bb0 Apr 16 19:27:00.477950 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:00.477853 2578 generic.go:358] "Generic (PLEG): container finished" podID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerID="5e7165810aca79cb83e7bd05eefbf1c6312d2d1cc4d34c61c7750bec16db40ff" exitCode=0 Apr 16 19:27:00.478102 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:00.477950 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" event={"ID":"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725","Type":"ContainerDied","Data":"5e7165810aca79cb83e7bd05eefbf1c6312d2d1cc4d34c61c7750bec16db40ff"} Apr 16 19:27:00.478102 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:00.477997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" event={"ID":"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725","Type":"ContainerStarted","Data":"d0b517e7d6d543ce2f7423aa2380a76ae74ce8e22ae3c9ba9f6b023183784bb0"} Apr 16 19:27:02.488134 ip-10-0-130-163 kubenswrapper[2578]: 
I0416 19:27:02.488094 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" event={"ID":"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725","Type":"ContainerStarted","Data":"645fa416d38242712bfb17110d8491fa6da36b454b4dce78c4d4de8461fe2518"} Apr 16 19:27:03.125000 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.124961 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-dqptn"] Apr 16 19:27:03.128172 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.128153 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" Apr 16 19:27:03.131423 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.131386 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 19:27:03.131514 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.131454 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 19:27:03.132355 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.132339 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-w6lpx\"" Apr 16 19:27:03.144800 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.144773 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-dqptn"] Apr 16 19:27:03.225216 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.225175 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14c30787-e48d-4710-82d7-c6be1e75bc60-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-dqptn\" (UID: \"14c30787-e48d-4710-82d7-c6be1e75bc60\") " 
pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" Apr 16 19:27:03.225380 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.225237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vhrk\" (UniqueName: \"kubernetes.io/projected/14c30787-e48d-4710-82d7-c6be1e75bc60-kube-api-access-9vhrk\") pod \"cert-manager-webhook-597b96b99b-dqptn\" (UID: \"14c30787-e48d-4710-82d7-c6be1e75bc60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" Apr 16 19:27:03.326746 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.326701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14c30787-e48d-4710-82d7-c6be1e75bc60-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-dqptn\" (UID: \"14c30787-e48d-4710-82d7-c6be1e75bc60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" Apr 16 19:27:03.326746 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.326756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vhrk\" (UniqueName: \"kubernetes.io/projected/14c30787-e48d-4710-82d7-c6be1e75bc60-kube-api-access-9vhrk\") pod \"cert-manager-webhook-597b96b99b-dqptn\" (UID: \"14c30787-e48d-4710-82d7-c6be1e75bc60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" Apr 16 19:27:03.335865 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.335838 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vhrk\" (UniqueName: \"kubernetes.io/projected/14c30787-e48d-4710-82d7-c6be1e75bc60-kube-api-access-9vhrk\") pod \"cert-manager-webhook-597b96b99b-dqptn\" (UID: \"14c30787-e48d-4710-82d7-c6be1e75bc60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" Apr 16 19:27:03.335990 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.335945 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/14c30787-e48d-4710-82d7-c6be1e75bc60-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-dqptn\" (UID: \"14c30787-e48d-4710-82d7-c6be1e75bc60\") " pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" Apr 16 19:27:03.455505 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.455389 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" Apr 16 19:27:03.493074 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.493040 2578 generic.go:358] "Generic (PLEG): container finished" podID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerID="645fa416d38242712bfb17110d8491fa6da36b454b4dce78c4d4de8461fe2518" exitCode=0 Apr 16 19:27:03.493552 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.493111 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" event={"ID":"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725","Type":"ContainerDied","Data":"645fa416d38242712bfb17110d8491fa6da36b454b4dce78c4d4de8461fe2518"} Apr 16 19:27:03.605229 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:03.605194 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-dqptn"] Apr 16 19:27:03.608359 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:03.608318 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14c30787_e48d_4710_82d7_c6be1e75bc60.slice/crio-4ffdee1aa59955cc60bdd5713382aeeb4a5e334d744bc05823a96acc290b0380 WatchSource:0}: Error finding container 4ffdee1aa59955cc60bdd5713382aeeb4a5e334d744bc05823a96acc290b0380: Status 404 returned error can't find the container with id 4ffdee1aa59955cc60bdd5713382aeeb4a5e334d744bc05823a96acc290b0380 Apr 16 19:27:04.499462 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:04.499425 2578 generic.go:358] "Generic (PLEG): container 
finished" podID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerID="38e64042bf5e25e25697d25eddeeb5b230c1fc3ee03b183d9fbea15c6efb3e0c" exitCode=0 Apr 16 19:27:04.499984 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:04.499511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" event={"ID":"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725","Type":"ContainerDied","Data":"38e64042bf5e25e25697d25eddeeb5b230c1fc3ee03b183d9fbea15c6efb3e0c"} Apr 16 19:27:04.500598 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:04.500563 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" event={"ID":"14c30787-e48d-4710-82d7-c6be1e75bc60","Type":"ContainerStarted","Data":"4ffdee1aa59955cc60bdd5713382aeeb4a5e334d744bc05823a96acc290b0380"} Apr 16 19:27:06.347797 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.347769 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:27:06.458758 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.458730 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-util\") pod \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " Apr 16 19:27:06.458900 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.458768 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m4rj\" (UniqueName: \"kubernetes.io/projected/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-kube-api-access-9m4rj\") pod \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " Apr 16 19:27:06.458900 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.458808 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-bundle\") pod \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\" (UID: \"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725\") " Apr 16 19:27:06.459267 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.459238 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-bundle" (OuterVolumeSpecName: "bundle") pod "6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" (UID: "6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:27:06.461167 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.461132 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-kube-api-access-9m4rj" (OuterVolumeSpecName: "kube-api-access-9m4rj") pod "6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" (UID: "6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725"). InnerVolumeSpecName "kube-api-access-9m4rj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:27:06.463291 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.463248 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-util" (OuterVolumeSpecName: "util") pod "6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" (UID: "6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:27:06.508164 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.508125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" event={"ID":"6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725","Type":"ContainerDied","Data":"d0b517e7d6d543ce2f7423aa2380a76ae74ce8e22ae3c9ba9f6b023183784bb0"} Apr 16 19:27:06.508164 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.508153 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fwdrjn" Apr 16 19:27:06.508164 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.508169 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0b517e7d6d543ce2f7423aa2380a76ae74ce8e22ae3c9ba9f6b023183784bb0" Apr 16 19:27:06.509558 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.509527 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" event={"ID":"14c30787-e48d-4710-82d7-c6be1e75bc60","Type":"ContainerStarted","Data":"a95734006e6c8823ea63ea871b2c43cb47e67f23f64f121ac4204bea6fb2e9f5"} Apr 16 19:27:06.509676 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.509575 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" Apr 16 19:27:06.526634 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.526571 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn" podStartSLOduration=0.739846039 podStartE2EDuration="3.52654949s" podCreationTimestamp="2026-04-16 19:27:03 +0000 UTC" firstStartedPulling="2026-04-16 19:27:03.610508612 +0000 UTC m=+535.348411858" lastFinishedPulling="2026-04-16 19:27:06.397212059 +0000 UTC m=+538.135115309" observedRunningTime="2026-04-16 19:27:06.525522097 +0000 UTC m=+538.263425369" watchObservedRunningTime="2026-04-16 19:27:06.52654949 +0000 UTC m=+538.264452759" Apr 16 19:27:06.559427 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.559372 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:27:06.559427 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.559427 2578 reconciler_common.go:299] "Volume detached for 
volume \"kube-api-access-9m4rj\" (UniqueName: \"kubernetes.io/projected/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-kube-api-access-9m4rj\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:27:06.559679 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:06.559443 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:27:10.015663 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.015620 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-bqxhs"] Apr 16 19:27:10.016131 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.016027 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerName="extract" Apr 16 19:27:10.016131 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.016045 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerName="extract" Apr 16 19:27:10.016131 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.016061 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerName="util" Apr 16 19:27:10.016131 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.016069 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerName="util" Apr 16 19:27:10.016131 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.016083 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerName="pull" Apr 16 19:27:10.016131 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.016090 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerName="pull" Apr 16 19:27:10.016497 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:27:10.016148 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a1d0ef8-3c95-4bcb-a5fc-65d3ec1ee725" containerName="extract" Apr 16 19:27:10.021284 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.021260 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-bqxhs" Apr 16 19:27:10.023727 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.023699 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-j764x\"" Apr 16 19:27:10.029101 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.029065 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-bqxhs"] Apr 16 19:27:10.089688 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.089647 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl6ms\" (UniqueName: \"kubernetes.io/projected/d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f-kube-api-access-kl6ms\") pod \"cert-manager-759f64656b-bqxhs\" (UID: \"d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f\") " pod="cert-manager/cert-manager-759f64656b-bqxhs" Apr 16 19:27:10.089688 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.089688 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f-bound-sa-token\") pod \"cert-manager-759f64656b-bqxhs\" (UID: \"d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f\") " pod="cert-manager/cert-manager-759f64656b-bqxhs" Apr 16 19:27:10.190638 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.190604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl6ms\" (UniqueName: \"kubernetes.io/projected/d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f-kube-api-access-kl6ms\") pod \"cert-manager-759f64656b-bqxhs\" (UID: 
\"d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f\") " pod="cert-manager/cert-manager-759f64656b-bqxhs" Apr 16 19:27:10.190638 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.190645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f-bound-sa-token\") pod \"cert-manager-759f64656b-bqxhs\" (UID: \"d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f\") " pod="cert-manager/cert-manager-759f64656b-bqxhs" Apr 16 19:27:10.199680 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.199654 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f-bound-sa-token\") pod \"cert-manager-759f64656b-bqxhs\" (UID: \"d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f\") " pod="cert-manager/cert-manager-759f64656b-bqxhs" Apr 16 19:27:10.199836 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.199820 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl6ms\" (UniqueName: \"kubernetes.io/projected/d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f-kube-api-access-kl6ms\") pod \"cert-manager-759f64656b-bqxhs\" (UID: \"d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f\") " pod="cert-manager/cert-manager-759f64656b-bqxhs" Apr 16 19:27:10.331108 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.331017 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-bqxhs" Apr 16 19:27:10.462134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.462089 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-bqxhs"] Apr 16 19:27:10.464910 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:10.464876 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7c08a8b_4cf0_49c8_af1d_fcbab3c7a33f.slice/crio-37d4665d002a5f64162e7c4413acc73e5585ca5bbcb4bb65dc0b1d2dc44f502f WatchSource:0}: Error finding container 37d4665d002a5f64162e7c4413acc73e5585ca5bbcb4bb65dc0b1d2dc44f502f: Status 404 returned error can't find the container with id 37d4665d002a5f64162e7c4413acc73e5585ca5bbcb4bb65dc0b1d2dc44f502f Apr 16 19:27:10.522055 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:10.522026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-bqxhs" event={"ID":"d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f","Type":"ContainerStarted","Data":"37d4665d002a5f64162e7c4413acc73e5585ca5bbcb4bb65dc0b1d2dc44f502f"} Apr 16 19:27:11.526713 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:11.526670 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-bqxhs" event={"ID":"d7c08a8b-4cf0-49c8-af1d-fcbab3c7a33f","Type":"ContainerStarted","Data":"84a0a7c2f6cd4e0ef84beee245ebd373b1f401310d3b2b35d845c415ca093fcf"} Apr 16 19:27:11.544425 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:11.544362 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-bqxhs" podStartSLOduration=2.544347805 podStartE2EDuration="2.544347805s" podCreationTimestamp="2026-04-16 19:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:27:11.543081616 +0000 UTC 
m=+543.280984885" watchObservedRunningTime="2026-04-16 19:27:11.544347805 +0000 UTC m=+543.282251072" Apr 16 19:27:11.992098 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:11.992055 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7"] Apr 16 19:27:11.995924 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:11.995901 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7" Apr 16 19:27:11.998475 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:11.998452 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 19:27:11.998577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:11.998460 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:27:11.998577 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:11.998492 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-c6kbs\"" Apr 16 19:27:12.002748 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.002724 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7"] Apr 16 19:27:12.105847 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.105803 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6-tmp\") pod \"openshift-lws-operator-bfc7f696d-89ht7\" (UID: \"a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7" Apr 16 19:27:12.105847 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.105851 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxn44\" (UniqueName: \"kubernetes.io/projected/a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6-kube-api-access-qxn44\") pod \"openshift-lws-operator-bfc7f696d-89ht7\" (UID: \"a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7"
Apr 16 19:27:12.207182 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.207124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6-tmp\") pod \"openshift-lws-operator-bfc7f696d-89ht7\" (UID: \"a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7"
Apr 16 19:27:12.207370 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.207196 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxn44\" (UniqueName: \"kubernetes.io/projected/a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6-kube-api-access-qxn44\") pod \"openshift-lws-operator-bfc7f696d-89ht7\" (UID: \"a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7"
Apr 16 19:27:12.207626 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.207607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6-tmp\") pod \"openshift-lws-operator-bfc7f696d-89ht7\" (UID: \"a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7"
Apr 16 19:27:12.217006 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.216976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxn44\" (UniqueName: \"kubernetes.io/projected/a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6-kube-api-access-qxn44\") pod \"openshift-lws-operator-bfc7f696d-89ht7\" (UID: \"a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7"
Apr 16 19:27:12.307219 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.307134 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7"
Apr 16 19:27:12.429907 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.429864 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7"]
Apr 16 19:27:12.434216 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:12.434175 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7a5ab98_ce3a_4bf8_837c_3b1ba623c2f6.slice/crio-057fd197a34d70d54034e3074ea4e7d014ac950e8c1ef1544ae7602a5396f802 WatchSource:0}: Error finding container 057fd197a34d70d54034e3074ea4e7d014ac950e8c1ef1544ae7602a5396f802: Status 404 returned error can't find the container with id 057fd197a34d70d54034e3074ea4e7d014ac950e8c1ef1544ae7602a5396f802
Apr 16 19:27:12.514637 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.514587 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-dqptn"
Apr 16 19:27:12.530317 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:12.530284 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7" event={"ID":"a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6","Type":"ContainerStarted","Data":"057fd197a34d70d54034e3074ea4e7d014ac950e8c1ef1544ae7602a5396f802"}
Apr 16 19:27:14.538770 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:14.538726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7" event={"ID":"a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6","Type":"ContainerStarted","Data":"12fa62e7cd8e5bb1e91a18e5705a3fa50ecaf1fb744fcdb31f2f786e57833ea0"}
Apr 16 19:27:14.556107 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:14.556055 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-89ht7" podStartSLOduration=1.8870349260000001 podStartE2EDuration="3.556040025s" podCreationTimestamp="2026-04-16 19:27:11 +0000 UTC" firstStartedPulling="2026-04-16 19:27:12.435667991 +0000 UTC m=+544.173571251" lastFinishedPulling="2026-04-16 19:27:14.1046731 +0000 UTC m=+545.842576350" observedRunningTime="2026-04-16 19:27:14.555066028 +0000 UTC m=+546.292969300" watchObservedRunningTime="2026-04-16 19:27:14.556040025 +0000 UTC m=+546.293943294"
Apr 16 19:27:16.918865 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:16.918824 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"]
Apr 16 19:27:16.947442 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:16.947385 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"]
Apr 16 19:27:16.947605 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:16.947552 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:16.950858 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:16.950833 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 19:27:16.952270 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:16.952250 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 19:27:16.952385 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:16.952250 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ddqj5\""
Apr 16 19:27:17.050348 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.050304 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.050551 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.050383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.050551 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.050436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49xp4\" (UniqueName: \"kubernetes.io/projected/40366baa-e324-47e7-96f4-d9decef09568-kube-api-access-49xp4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.151154 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.151111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.151154 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.151155 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49xp4\" (UniqueName: \"kubernetes.io/projected/40366baa-e324-47e7-96f4-d9decef09568-kube-api-access-49xp4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.151365 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.151208 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.151548 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.151528 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.151627 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.151604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.163653 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.163614 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49xp4\" (UniqueName: \"kubernetes.io/projected/40366baa-e324-47e7-96f4-d9decef09568-kube-api-access-49xp4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.257254 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.257219 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:17.390213 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.390070 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"]
Apr 16 19:27:17.392872 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:17.392843 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40366baa_e324_47e7_96f4_d9decef09568.slice/crio-edc792687996ae7b6ff3be97cd1eabcc81390265bea0ad85b0f3d58103b61116 WatchSource:0}: Error finding container edc792687996ae7b6ff3be97cd1eabcc81390265bea0ad85b0f3d58103b61116: Status 404 returned error can't find the container with id edc792687996ae7b6ff3be97cd1eabcc81390265bea0ad85b0f3d58103b61116
Apr 16 19:27:17.551040 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.550944 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88" event={"ID":"40366baa-e324-47e7-96f4-d9decef09568","Type":"ContainerStarted","Data":"86c7fcf7501c4efd2aeebe22e2931e76440c0ed5993884f2f7ed3caff2a682a0"}
Apr 16 19:27:17.551040 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:17.550988 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88" event={"ID":"40366baa-e324-47e7-96f4-d9decef09568","Type":"ContainerStarted","Data":"edc792687996ae7b6ff3be97cd1eabcc81390265bea0ad85b0f3d58103b61116"}
Apr 16 19:27:18.555665 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:18.555625 2578 generic.go:358] "Generic (PLEG): container finished" podID="40366baa-e324-47e7-96f4-d9decef09568" containerID="86c7fcf7501c4efd2aeebe22e2931e76440c0ed5993884f2f7ed3caff2a682a0" exitCode=0
Apr 16 19:27:18.556066 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:18.555684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88" event={"ID":"40366baa-e324-47e7-96f4-d9decef09568","Type":"ContainerDied","Data":"86c7fcf7501c4efd2aeebe22e2931e76440c0ed5993884f2f7ed3caff2a682a0"}
Apr 16 19:27:20.563016 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:20.562976 2578 generic.go:358] "Generic (PLEG): container finished" podID="40366baa-e324-47e7-96f4-d9decef09568" containerID="c759b7e8111dd098ed19ec52b67709698fd89a1fa9c07e0fbb5712a133db3bab" exitCode=0
Apr 16 19:27:20.563450 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:20.563061 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88" event={"ID":"40366baa-e324-47e7-96f4-d9decef09568","Type":"ContainerDied","Data":"c759b7e8111dd098ed19ec52b67709698fd89a1fa9c07e0fbb5712a133db3bab"}
Apr 16 19:27:21.568383 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:21.568345 2578 generic.go:358] "Generic (PLEG): container finished" podID="40366baa-e324-47e7-96f4-d9decef09568" containerID="68fec69eac2f1f86d6a84a9e86ed023cc65ab6b36c966cda66a962b74b855a57" exitCode=0
Apr 16 19:27:21.568760 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:21.568421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88" event={"ID":"40366baa-e324-47e7-96f4-d9decef09568","Type":"ContainerDied","Data":"68fec69eac2f1f86d6a84a9e86ed023cc65ab6b36c966cda66a962b74b855a57"}
Apr 16 19:27:22.692137 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.692112 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:22.798279 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.798242 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-bundle\") pod \"40366baa-e324-47e7-96f4-d9decef09568\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") "
Apr 16 19:27:22.798474 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.798320 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-util\") pod \"40366baa-e324-47e7-96f4-d9decef09568\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") "
Apr 16 19:27:22.798474 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.798357 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49xp4\" (UniqueName: \"kubernetes.io/projected/40366baa-e324-47e7-96f4-d9decef09568-kube-api-access-49xp4\") pod \"40366baa-e324-47e7-96f4-d9decef09568\" (UID: \"40366baa-e324-47e7-96f4-d9decef09568\") "
Apr 16 19:27:22.798949 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.798923 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-bundle" (OuterVolumeSpecName: "bundle") pod "40366baa-e324-47e7-96f4-d9decef09568" (UID: "40366baa-e324-47e7-96f4-d9decef09568"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:27:22.800569 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.800547 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40366baa-e324-47e7-96f4-d9decef09568-kube-api-access-49xp4" (OuterVolumeSpecName: "kube-api-access-49xp4") pod "40366baa-e324-47e7-96f4-d9decef09568" (UID: "40366baa-e324-47e7-96f4-d9decef09568"). InnerVolumeSpecName "kube-api-access-49xp4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:27:22.803342 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.803315 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-util" (OuterVolumeSpecName: "util") pod "40366baa-e324-47e7-96f4-d9decef09568" (UID: "40366baa-e324-47e7-96f4-d9decef09568"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:27:22.899267 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.899243 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:27:22.899267 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.899267 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49xp4\" (UniqueName: \"kubernetes.io/projected/40366baa-e324-47e7-96f4-d9decef09568-kube-api-access-49xp4\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:27:22.899416 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:22.899277 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40366baa-e324-47e7-96f4-d9decef09568-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:27:23.576131 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:23.576094 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88" event={"ID":"40366baa-e324-47e7-96f4-d9decef09568","Type":"ContainerDied","Data":"edc792687996ae7b6ff3be97cd1eabcc81390265bea0ad85b0f3d58103b61116"}
Apr 16 19:27:23.576131 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:23.576117 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5spt88"
Apr 16 19:27:23.576361 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:23.576128 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edc792687996ae7b6ff3be97cd1eabcc81390265bea0ad85b0f3d58103b61116"
Apr 16 19:27:28.511948 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.511910 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"]
Apr 16 19:27:28.512343 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.512202 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40366baa-e324-47e7-96f4-d9decef09568" containerName="pull"
Apr 16 19:27:28.512343 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.512212 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="40366baa-e324-47e7-96f4-d9decef09568" containerName="pull"
Apr 16 19:27:28.512343 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.512223 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40366baa-e324-47e7-96f4-d9decef09568" containerName="extract"
Apr 16 19:27:28.512343 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.512229 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="40366baa-e324-47e7-96f4-d9decef09568" containerName="extract"
Apr 16 19:27:28.512343 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.512238 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40366baa-e324-47e7-96f4-d9decef09568" containerName="util"
Apr 16 19:27:28.512343 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.512246 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="40366baa-e324-47e7-96f4-d9decef09568" containerName="util"
Apr 16 19:27:28.512343 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.512294 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="40366baa-e324-47e7-96f4-d9decef09568" containerName="extract"
Apr 16 19:27:28.516633 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.516611 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.519527 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.519493 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 19:27:28.520174 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.520154 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ddqj5\""
Apr 16 19:27:28.520287 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.520188 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 19:27:28.524824 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.524798 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"]
Apr 16 19:27:28.643450 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.643386 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.643649 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.643491 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rhfq\" (UniqueName: \"kubernetes.io/projected/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-kube-api-access-6rhfq\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.643649 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.643530 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.744733 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.744687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.744733 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.744741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rhfq\" (UniqueName: \"kubernetes.io/projected/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-kube-api-access-6rhfq\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.744944 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.744762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.745089 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.745065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.745130 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.745079 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.765283 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.765198 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rhfq\" (UniqueName: \"kubernetes.io/projected/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-kube-api-access-6rhfq\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.827116 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.827058 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:28.965521 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:28.965495 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"]
Apr 16 19:27:28.967939 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:28.967905 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49eb59f3_a6d4_46a0_9a16_eac2f381fd05.slice/crio-eff58de84677f19d2e08fc07f524a31e0037a6cab39640e2de00c4a7fda19fd4 WatchSource:0}: Error finding container eff58de84677f19d2e08fc07f524a31e0037a6cab39640e2de00c4a7fda19fd4: Status 404 returned error can't find the container with id eff58de84677f19d2e08fc07f524a31e0037a6cab39640e2de00c4a7fda19fd4
Apr 16 19:27:29.595765 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:29.595723 2578 generic.go:358] "Generic (PLEG): container finished" podID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerID="8ef14b623c62bfd1e3e2308e3b973f3d90b8d0fc8dcb2b4031dfb9be2542483f" exitCode=0
Apr 16 19:27:29.596126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:29.595778 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk" event={"ID":"49eb59f3-a6d4-46a0-9a16-eac2f381fd05","Type":"ContainerDied","Data":"8ef14b623c62bfd1e3e2308e3b973f3d90b8d0fc8dcb2b4031dfb9be2542483f"}
Apr 16 19:27:29.596126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:29.595806 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk" event={"ID":"49eb59f3-a6d4-46a0-9a16-eac2f381fd05","Type":"ContainerStarted","Data":"eff58de84677f19d2e08fc07f524a31e0037a6cab39640e2de00c4a7fda19fd4"}
Apr 16 19:27:30.601160 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.601112 2578 generic.go:358] "Generic (PLEG): container finished" podID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerID="593767be6f8536aeaea1259c42da680a4abccaa42deb05bb0d1ab02f61597f60" exitCode=0
Apr 16 19:27:30.601637 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.601201 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk" event={"ID":"49eb59f3-a6d4-46a0-9a16-eac2f381fd05","Type":"ContainerDied","Data":"593767be6f8536aeaea1259c42da680a4abccaa42deb05bb0d1ab02f61597f60"}
Apr 16 19:27:30.821729 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.821694 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"]
Apr 16 19:27:30.824921 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.824902 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:30.828127 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.828092 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-42x6b\""
Apr 16 19:27:30.828254 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.828124 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 19:27:30.828510 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.828388 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 19:27:30.828728 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.828701 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 19:27:30.835565 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.835529 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 19:27:30.839961 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.839939 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"]
Apr 16 19:27:30.961871 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.961785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6hwk\" (UniqueName: \"kubernetes.io/projected/f2d31f98-a2f2-4976-a57e-f7e4f46a93f6-kube-api-access-f6hwk\") pod \"opendatahub-operator-controller-manager-66b64c949f-c9985\" (UID: \"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:30.961871 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.961849 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2d31f98-a2f2-4976-a57e-f7e4f46a93f6-apiservice-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-c9985\" (UID: \"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:30.962048 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:30.961892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2d31f98-a2f2-4976-a57e-f7e4f46a93f6-webhook-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-c9985\" (UID: \"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:31.063309 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.063270 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f6hwk\" (UniqueName: \"kubernetes.io/projected/f2d31f98-a2f2-4976-a57e-f7e4f46a93f6-kube-api-access-f6hwk\") pod \"opendatahub-operator-controller-manager-66b64c949f-c9985\" (UID: \"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:31.063532 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.063325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2d31f98-a2f2-4976-a57e-f7e4f46a93f6-apiservice-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-c9985\" (UID: \"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:31.063532 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.063350 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2d31f98-a2f2-4976-a57e-f7e4f46a93f6-webhook-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-c9985\" (UID: \"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:31.066015 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.065990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f2d31f98-a2f2-4976-a57e-f7e4f46a93f6-webhook-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-c9985\" (UID: \"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:31.066143 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.066122 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f2d31f98-a2f2-4976-a57e-f7e4f46a93f6-apiservice-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-c9985\" (UID: \"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:31.075527 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.075498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6hwk\" (UniqueName: \"kubernetes.io/projected/f2d31f98-a2f2-4976-a57e-f7e4f46a93f6-kube-api-access-f6hwk\") pod \"opendatahub-operator-controller-manager-66b64c949f-c9985\" (UID: \"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:31.147172 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.147123 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"
Apr 16 19:27:31.281541 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.281511 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985"]
Apr 16 19:27:31.283324 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:31.283295 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2d31f98_a2f2_4976_a57e_f7e4f46a93f6.slice/crio-cedce2f5ba88dc25d589799adf66c1a0b835d558fe5f4c4ec22745356f068471 WatchSource:0}: Error finding container cedce2f5ba88dc25d589799adf66c1a0b835d558fe5f4c4ec22745356f068471: Status 404 returned error can't find the container with id cedce2f5ba88dc25d589799adf66c1a0b835d558fe5f4c4ec22745356f068471
Apr 16 19:27:31.608209 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.608169 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985" event={"ID":"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6","Type":"ContainerStarted","Data":"cedce2f5ba88dc25d589799adf66c1a0b835d558fe5f4c4ec22745356f068471"}
Apr 16 19:27:31.610061 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.610036 2578 generic.go:358] "Generic (PLEG): container finished" podID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerID="105d847a1bdf75f54686145dce8affe44f82f351d66910bab02d55cacf9254a6" exitCode=0
Apr 16 19:27:31.610164 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:31.610129 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk" event={"ID":"49eb59f3-a6d4-46a0-9a16-eac2f381fd05","Type":"ContainerDied","Data":"105d847a1bdf75f54686145dce8affe44f82f351d66910bab02d55cacf9254a6"}
Apr 16 19:27:33.798367 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:33.798341 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk"
Apr 16 19:27:33.989954 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:33.989912 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-bundle\") pod \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") "
Apr 16 19:27:33.990115 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:33.989983 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rhfq\" (UniqueName: \"kubernetes.io/projected/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-kube-api-access-6rhfq\") pod \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") "
Apr 16 19:27:33.990115 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:33.990010 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-util\") pod \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\" (UID: \"49eb59f3-a6d4-46a0-9a16-eac2f381fd05\") "
Apr 16 19:27:33.990905 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:33.990871 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-bundle" (OuterVolumeSpecName: "bundle") pod "49eb59f3-a6d4-46a0-9a16-eac2f381fd05" (UID: "49eb59f3-a6d4-46a0-9a16-eac2f381fd05"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:27:33.992451 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:33.992429 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-kube-api-access-6rhfq" (OuterVolumeSpecName: "kube-api-access-6rhfq") pod "49eb59f3-a6d4-46a0-9a16-eac2f381fd05" (UID: "49eb59f3-a6d4-46a0-9a16-eac2f381fd05"). InnerVolumeSpecName "kube-api-access-6rhfq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:27:33.997018 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:33.996989 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-util" (OuterVolumeSpecName: "util") pod "49eb59f3-a6d4-46a0-9a16-eac2f381fd05" (UID: "49eb59f3-a6d4-46a0-9a16-eac2f381fd05"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:27:34.091538 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:34.091454 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rhfq\" (UniqueName: \"kubernetes.io/projected/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-kube-api-access-6rhfq\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:27:34.091538 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:34.091484 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:27:34.091538 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:34.091494 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/49eb59f3-a6d4-46a0-9a16-eac2f381fd05-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:27:34.622671 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:34.622623 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985" event={"ID":"f2d31f98-a2f2-4976-a57e-f7e4f46a93f6","Type":"ContainerStarted","Data":"2b94bb5c274d1d878e467f1cf2828f2824e676f0b53af666685d45306019f827"} Apr 16 19:27:34.623009 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:34.622721 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985" Apr 16 19:27:34.624441 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:34.624396 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk" Apr 16 19:27:34.624441 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:34.624430 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9kt7dk" event={"ID":"49eb59f3-a6d4-46a0-9a16-eac2f381fd05","Type":"ContainerDied","Data":"eff58de84677f19d2e08fc07f524a31e0037a6cab39640e2de00c4a7fda19fd4"} Apr 16 19:27:34.624592 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:34.624457 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eff58de84677f19d2e08fc07f524a31e0037a6cab39640e2de00c4a7fda19fd4" Apr 16 19:27:34.665877 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:34.665811 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985" podStartSLOduration=2.121344849 podStartE2EDuration="4.66579532s" podCreationTimestamp="2026-04-16 19:27:30 +0000 UTC" firstStartedPulling="2026-04-16 19:27:31.285084184 +0000 UTC m=+563.022987430" lastFinishedPulling="2026-04-16 19:27:33.829534654 +0000 UTC m=+565.567437901" observedRunningTime="2026-04-16 19:27:34.664262316 +0000 UTC m=+566.402165583" watchObservedRunningTime="2026-04-16 19:27:34.66579532 
+0000 UTC m=+566.403698587" Apr 16 19:27:45.629508 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:45.629476 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-c9985" Apr 16 19:27:47.654076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.654040 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd"] Apr 16 19:27:47.654485 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.654347 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerName="pull" Apr 16 19:27:47.654485 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.654361 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerName="pull" Apr 16 19:27:47.654485 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.654371 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerName="extract" Apr 16 19:27:47.654485 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.654377 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerName="extract" Apr 16 19:27:47.654485 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.654392 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerName="util" Apr 16 19:27:47.654485 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.654399 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerName="util" Apr 16 19:27:47.654485 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.654474 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="49eb59f3-a6d4-46a0-9a16-eac2f381fd05" containerName="extract" Apr 16 
19:27:47.660478 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.660455 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.663931 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.663911 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:27:47.664209 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.664191 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:27:47.668788 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.668770 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ddqj5\"" Apr 16 19:27:47.680124 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.680090 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd"] Apr 16 19:27:47.690508 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.690474 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.690683 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.690522 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd\" (UID: 
\"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.690683 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.690542 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n55tm\" (UniqueName: \"kubernetes.io/projected/44b8ff5a-6a18-4557-9c02-49d991328ef2-kube-api-access-n55tm\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.791607 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.791566 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.791607 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.791612 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.791822 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.791634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n55tm\" (UniqueName: \"kubernetes.io/projected/44b8ff5a-6a18-4557-9c02-49d991328ef2-kube-api-access-n55tm\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd\" (UID: 
\"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.791995 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.791973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.792031 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.792017 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.804907 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.804863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n55tm\" (UniqueName: \"kubernetes.io/projected/44b8ff5a-6a18-4557-9c02-49d991328ef2-kube-api-access-n55tm\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.904196 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.904121 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-d894ddccb-92r84"] Apr 16 19:27:47.908537 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.908518 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:47.911257 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.911236 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 19:27:47.911845 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.911802 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 19:27:47.911983 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.911849 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 19:27:47.911983 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.911883 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-fsk68\"" Apr 16 19:27:47.912129 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.912112 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 19:27:47.920449 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.920424 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-d894ddccb-92r84"] Apr 16 19:27:47.969838 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.969804 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:47.994579 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.994538 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/819903d8-46ef-467a-8e58-d186915a391c-tmp\") pod \"kube-auth-proxy-d894ddccb-92r84\" (UID: \"819903d8-46ef-467a-8e58-d186915a391c\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:47.994579 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.994579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbptb\" (UniqueName: \"kubernetes.io/projected/819903d8-46ef-467a-8e58-d186915a391c-kube-api-access-xbptb\") pod \"kube-auth-proxy-d894ddccb-92r84\" (UID: \"819903d8-46ef-467a-8e58-d186915a391c\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:47.994775 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:47.994642 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/819903d8-46ef-467a-8e58-d186915a391c-tls-certs\") pod \"kube-auth-proxy-d894ddccb-92r84\" (UID: \"819903d8-46ef-467a-8e58-d186915a391c\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:48.095601 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.095564 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/819903d8-46ef-467a-8e58-d186915a391c-tmp\") pod \"kube-auth-proxy-d894ddccb-92r84\" (UID: \"819903d8-46ef-467a-8e58-d186915a391c\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:48.095793 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.095607 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xbptb\" (UniqueName: \"kubernetes.io/projected/819903d8-46ef-467a-8e58-d186915a391c-kube-api-access-xbptb\") pod \"kube-auth-proxy-d894ddccb-92r84\" (UID: \"819903d8-46ef-467a-8e58-d186915a391c\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:48.095793 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.095648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/819903d8-46ef-467a-8e58-d186915a391c-tls-certs\") pod \"kube-auth-proxy-d894ddccb-92r84\" (UID: \"819903d8-46ef-467a-8e58-d186915a391c\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:48.098116 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.098087 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/819903d8-46ef-467a-8e58-d186915a391c-tmp\") pod \"kube-auth-proxy-d894ddccb-92r84\" (UID: \"819903d8-46ef-467a-8e58-d186915a391c\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:48.098250 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.098229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/819903d8-46ef-467a-8e58-d186915a391c-tls-certs\") pod \"kube-auth-proxy-d894ddccb-92r84\" (UID: \"819903d8-46ef-467a-8e58-d186915a391c\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:48.109667 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.109638 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd"] Apr 16 19:27:48.111577 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:48.111547 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b8ff5a_6a18_4557_9c02_49d991328ef2.slice/crio-962cad87b848cdead357e407601b6d8e08c7ffcc9eb892c52f4ef44fadcebaf3 WatchSource:0}: Error finding container 962cad87b848cdead357e407601b6d8e08c7ffcc9eb892c52f4ef44fadcebaf3: Status 404 returned error can't find the container with id 962cad87b848cdead357e407601b6d8e08c7ffcc9eb892c52f4ef44fadcebaf3 Apr 16 19:27:48.118565 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.118544 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbptb\" (UniqueName: \"kubernetes.io/projected/819903d8-46ef-467a-8e58-d186915a391c-kube-api-access-xbptb\") pod \"kube-auth-proxy-d894ddccb-92r84\" (UID: \"819903d8-46ef-467a-8e58-d186915a391c\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:48.220153 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.220124 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" Apr 16 19:27:48.367537 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.367503 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-d894ddccb-92r84"] Apr 16 19:27:48.370533 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:48.370490 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod819903d8_46ef_467a_8e58_d186915a391c.slice/crio-b62b72feb24efc9fd3ae6c915157382f02fe0eabcf98aeec3037bbdc8182d191 WatchSource:0}: Error finding container b62b72feb24efc9fd3ae6c915157382f02fe0eabcf98aeec3037bbdc8182d191: Status 404 returned error can't find the container with id b62b72feb24efc9fd3ae6c915157382f02fe0eabcf98aeec3037bbdc8182d191 Apr 16 19:27:48.685597 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.685558 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" 
event={"ID":"819903d8-46ef-467a-8e58-d186915a391c","Type":"ContainerStarted","Data":"b62b72feb24efc9fd3ae6c915157382f02fe0eabcf98aeec3037bbdc8182d191"} Apr 16 19:27:48.688362 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.687348 2578 generic.go:358] "Generic (PLEG): container finished" podID="44b8ff5a-6a18-4557-9c02-49d991328ef2" containerID="87f8b57d4f57393cdfe9533f1b6c59868aedc4d77552f64a20802c3449e84b57" exitCode=0 Apr 16 19:27:48.688362 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.687445 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" event={"ID":"44b8ff5a-6a18-4557-9c02-49d991328ef2","Type":"ContainerDied","Data":"87f8b57d4f57393cdfe9533f1b6c59868aedc4d77552f64a20802c3449e84b57"} Apr 16 19:27:48.688362 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:48.687474 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" event={"ID":"44b8ff5a-6a18-4557-9c02-49d991328ef2","Type":"ContainerStarted","Data":"962cad87b848cdead357e407601b6d8e08c7ffcc9eb892c52f4ef44fadcebaf3"} Apr 16 19:27:49.693767 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:49.693681 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" event={"ID":"44b8ff5a-6a18-4557-9c02-49d991328ef2","Type":"ContainerStarted","Data":"74233755d0fa89ba9a95afbd7bf5802f04d2a0304955c62c3610135d31de70ca"} Apr 16 19:27:50.698662 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:50.698570 2578 generic.go:358] "Generic (PLEG): container finished" podID="44b8ff5a-6a18-4557-9c02-49d991328ef2" containerID="74233755d0fa89ba9a95afbd7bf5802f04d2a0304955c62c3610135d31de70ca" exitCode=0 Apr 16 19:27:50.699015 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:50.698662 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" event={"ID":"44b8ff5a-6a18-4557-9c02-49d991328ef2","Type":"ContainerDied","Data":"74233755d0fa89ba9a95afbd7bf5802f04d2a0304955c62c3610135d31de70ca"} Apr 16 19:27:51.683166 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.683128 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-xt2qv"] Apr 16 19:27:51.686541 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.686523 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:27:51.689082 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.689059 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 19:27:51.689193 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.689080 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-fgzpg\"" Apr 16 19:27:51.694544 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.694516 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-xt2qv"] Apr 16 19:27:51.704094 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.704063 2578 generic.go:358] "Generic (PLEG): container finished" podID="44b8ff5a-6a18-4557-9c02-49d991328ef2" containerID="06e6bb75ca747a8c337f282073995d1590656f47dc15e727d6222c98f030210e" exitCode=0 Apr 16 19:27:51.704499 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.704139 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" event={"ID":"44b8ff5a-6a18-4557-9c02-49d991328ef2","Type":"ContainerDied","Data":"06e6bb75ca747a8c337f282073995d1590656f47dc15e727d6222c98f030210e"} Apr 16 19:27:51.705604 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:27:51.705580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" event={"ID":"819903d8-46ef-467a-8e58-d186915a391c","Type":"ContainerStarted","Data":"c33701899f3db25aa86ce723086f565c2d5ebd732eb33d4fb18454a478215a30"} Apr 16 19:27:51.728238 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.728198 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebdd9a87-3859-41df-9bb9-7b8244bbebaa-cert\") pod \"odh-model-controller-858dbf95b8-xt2qv\" (UID: \"ebdd9a87-3859-41df-9bb9-7b8244bbebaa\") " pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:27:51.728238 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.728231 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg6s2\" (UniqueName: \"kubernetes.io/projected/ebdd9a87-3859-41df-9bb9-7b8244bbebaa-kube-api-access-wg6s2\") pod \"odh-model-controller-858dbf95b8-xt2qv\" (UID: \"ebdd9a87-3859-41df-9bb9-7b8244bbebaa\") " pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:27:51.740879 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.740325 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-d894ddccb-92r84" podStartSLOduration=1.557225065 podStartE2EDuration="4.740305928s" podCreationTimestamp="2026-04-16 19:27:47 +0000 UTC" firstStartedPulling="2026-04-16 19:27:48.372215084 +0000 UTC m=+580.110118330" lastFinishedPulling="2026-04-16 19:27:51.555295938 +0000 UTC m=+583.293199193" observedRunningTime="2026-04-16 19:27:51.740282863 +0000 UTC m=+583.478186132" watchObservedRunningTime="2026-04-16 19:27:51.740305928 +0000 UTC m=+583.478209196" Apr 16 19:27:51.829851 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.829745 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/ebdd9a87-3859-41df-9bb9-7b8244bbebaa-cert\") pod \"odh-model-controller-858dbf95b8-xt2qv\" (UID: \"ebdd9a87-3859-41df-9bb9-7b8244bbebaa\") " pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:27:51.829851 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.829789 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wg6s2\" (UniqueName: \"kubernetes.io/projected/ebdd9a87-3859-41df-9bb9-7b8244bbebaa-kube-api-access-wg6s2\") pod \"odh-model-controller-858dbf95b8-xt2qv\" (UID: \"ebdd9a87-3859-41df-9bb9-7b8244bbebaa\") " pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:27:51.830023 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:27:51.829901 2578 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 19:27:51.830023 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:27:51.829970 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ebdd9a87-3859-41df-9bb9-7b8244bbebaa-cert podName:ebdd9a87-3859-41df-9bb9-7b8244bbebaa nodeName:}" failed. No retries permitted until 2026-04-16 19:27:52.329952989 +0000 UTC m=+584.067856256 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ebdd9a87-3859-41df-9bb9-7b8244bbebaa-cert") pod "odh-model-controller-858dbf95b8-xt2qv" (UID: "ebdd9a87-3859-41df-9bb9-7b8244bbebaa") : secret "odh-model-controller-webhook-cert" not found Apr 16 19:27:51.839120 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:51.839094 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg6s2\" (UniqueName: \"kubernetes.io/projected/ebdd9a87-3859-41df-9bb9-7b8244bbebaa-kube-api-access-wg6s2\") pod \"odh-model-controller-858dbf95b8-xt2qv\" (UID: \"ebdd9a87-3859-41df-9bb9-7b8244bbebaa\") " pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:27:52.335242 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.335189 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebdd9a87-3859-41df-9bb9-7b8244bbebaa-cert\") pod \"odh-model-controller-858dbf95b8-xt2qv\" (UID: \"ebdd9a87-3859-41df-9bb9-7b8244bbebaa\") " pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:27:52.337892 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.337861 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ebdd9a87-3859-41df-9bb9-7b8244bbebaa-cert\") pod \"odh-model-controller-858dbf95b8-xt2qv\" (UID: \"ebdd9a87-3859-41df-9bb9-7b8244bbebaa\") " pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:27:52.597672 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.597627 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:27:52.722720 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.722680 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-xt2qv"] Apr 16 19:27:52.725694 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:52.725661 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebdd9a87_3859_41df_9bb9_7b8244bbebaa.slice/crio-8af6145610dd29a5dd86fb50e2d571633c5150fb161941553a940adbbae51881 WatchSource:0}: Error finding container 8af6145610dd29a5dd86fb50e2d571633c5150fb161941553a940adbbae51881: Status 404 returned error can't find the container with id 8af6145610dd29a5dd86fb50e2d571633c5150fb161941553a940adbbae51881 Apr 16 19:27:52.832359 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.832335 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:52.941488 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.941374 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n55tm\" (UniqueName: \"kubernetes.io/projected/44b8ff5a-6a18-4557-9c02-49d991328ef2-kube-api-access-n55tm\") pod \"44b8ff5a-6a18-4557-9c02-49d991328ef2\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " Apr 16 19:27:52.941488 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.941462 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-util\") pod \"44b8ff5a-6a18-4557-9c02-49d991328ef2\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " Apr 16 19:27:52.941680 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.941519 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-bundle\") pod \"44b8ff5a-6a18-4557-9c02-49d991328ef2\" (UID: \"44b8ff5a-6a18-4557-9c02-49d991328ef2\") " Apr 16 19:27:52.942354 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.942325 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-bundle" (OuterVolumeSpecName: "bundle") pod "44b8ff5a-6a18-4557-9c02-49d991328ef2" (UID: "44b8ff5a-6a18-4557-9c02-49d991328ef2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:27:52.943659 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.943634 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b8ff5a-6a18-4557-9c02-49d991328ef2-kube-api-access-n55tm" (OuterVolumeSpecName: "kube-api-access-n55tm") pod "44b8ff5a-6a18-4557-9c02-49d991328ef2" (UID: "44b8ff5a-6a18-4557-9c02-49d991328ef2"). InnerVolumeSpecName "kube-api-access-n55tm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:27:52.946752 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:52.946728 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-util" (OuterVolumeSpecName: "util") pod "44b8ff5a-6a18-4557-9c02-49d991328ef2" (UID: "44b8ff5a-6a18-4557-9c02-49d991328ef2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:27:53.042742 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:53.042703 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n55tm\" (UniqueName: \"kubernetes.io/projected/44b8ff5a-6a18-4557-9c02-49d991328ef2-kube-api-access-n55tm\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:27:53.042742 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:53.042737 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:27:53.042742 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:53.042748 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44b8ff5a-6a18-4557-9c02-49d991328ef2-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:27:53.723304 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:53.719127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" event={"ID":"ebdd9a87-3859-41df-9bb9-7b8244bbebaa","Type":"ContainerStarted","Data":"8af6145610dd29a5dd86fb50e2d571633c5150fb161941553a940adbbae51881"} Apr 16 19:27:53.725619 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:53.725550 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" event={"ID":"44b8ff5a-6a18-4557-9c02-49d991328ef2","Type":"ContainerDied","Data":"962cad87b848cdead357e407601b6d8e08c7ffcc9eb892c52f4ef44fadcebaf3"} Apr 16 19:27:53.725619 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:53.725593 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="962cad87b848cdead357e407601b6d8e08c7ffcc9eb892c52f4ef44fadcebaf3" Apr 16 19:27:53.725818 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:27:53.725725 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48352hwzd" Apr 16 19:27:55.735129 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:55.735020 2578 generic.go:358] "Generic (PLEG): container finished" podID="ebdd9a87-3859-41df-9bb9-7b8244bbebaa" containerID="ecf22a8b5165a987ce1498d828f156d082005322c2aac1b0ce43592e4ba23fb4" exitCode=1 Apr 16 19:27:55.735129 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:55.735098 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" event={"ID":"ebdd9a87-3859-41df-9bb9-7b8244bbebaa","Type":"ContainerDied","Data":"ecf22a8b5165a987ce1498d828f156d082005322c2aac1b0ce43592e4ba23fb4"} Apr 16 19:27:55.735566 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:55.735291 2578 scope.go:117] "RemoveContainer" containerID="ecf22a8b5165a987ce1498d828f156d082005322c2aac1b0ce43592e4ba23fb4" Apr 16 19:27:56.742258 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:56.742228 2578 generic.go:358] "Generic (PLEG): container finished" podID="ebdd9a87-3859-41df-9bb9-7b8244bbebaa" containerID="029b2e6bb3976e903a46719071345f0eb7de0939d6e3981afaa0a3e4fe1a2f90" exitCode=1 Apr 16 19:27:56.742662 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:56.742307 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" event={"ID":"ebdd9a87-3859-41df-9bb9-7b8244bbebaa","Type":"ContainerDied","Data":"029b2e6bb3976e903a46719071345f0eb7de0939d6e3981afaa0a3e4fe1a2f90"} Apr 16 19:27:56.742662 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:56.742352 2578 scope.go:117] "RemoveContainer" containerID="ecf22a8b5165a987ce1498d828f156d082005322c2aac1b0ce43592e4ba23fb4" Apr 16 19:27:56.742662 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:56.742560 2578 scope.go:117] "RemoveContainer" 
containerID="029b2e6bb3976e903a46719071345f0eb7de0939d6e3981afaa0a3e4fe1a2f90" Apr 16 19:27:56.742806 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:27:56.742787 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-xt2qv_opendatahub(ebdd9a87-3859-41df-9bb9-7b8244bbebaa)\"" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" podUID="ebdd9a87-3859-41df-9bb9-7b8244bbebaa" Apr 16 19:27:57.611433 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.611375 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-s95tf"] Apr 16 19:27:57.611714 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.611701 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44b8ff5a-6a18-4557-9c02-49d991328ef2" containerName="extract" Apr 16 19:27:57.611767 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.611717 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b8ff5a-6a18-4557-9c02-49d991328ef2" containerName="extract" Apr 16 19:27:57.611767 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.611727 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44b8ff5a-6a18-4557-9c02-49d991328ef2" containerName="util" Apr 16 19:27:57.611767 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.611733 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b8ff5a-6a18-4557-9c02-49d991328ef2" containerName="util" Apr 16 19:27:57.611767 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.611741 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44b8ff5a-6a18-4557-9c02-49d991328ef2" containerName="pull" Apr 16 19:27:57.611767 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.611747 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b8ff5a-6a18-4557-9c02-49d991328ef2" 
containerName="pull" Apr 16 19:27:57.611973 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.611802 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="44b8ff5a-6a18-4557-9c02-49d991328ef2" containerName="extract" Apr 16 19:27:57.615910 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.615886 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 19:27:57.619326 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.619298 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 19:27:57.619571 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.619542 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-4dvzw\"" Apr 16 19:27:57.627515 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.627487 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-s95tf"] Apr 16 19:27:57.685835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.685790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc7l2\" (UniqueName: \"kubernetes.io/projected/0718cf45-497e-48d3-8dc6-e073adda1fea-kube-api-access-zc7l2\") pod \"kserve-controller-manager-856948b99f-s95tf\" (UID: \"0718cf45-497e-48d3-8dc6-e073adda1fea\") " pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 19:27:57.686033 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.685931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0718cf45-497e-48d3-8dc6-e073adda1fea-cert\") pod \"kserve-controller-manager-856948b99f-s95tf\" (UID: \"0718cf45-497e-48d3-8dc6-e073adda1fea\") " pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 
19:27:57.747126 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.747097 2578 scope.go:117] "RemoveContainer" containerID="029b2e6bb3976e903a46719071345f0eb7de0939d6e3981afaa0a3e4fe1a2f90" Apr 16 19:27:57.747521 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:27:57.747268 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-xt2qv_opendatahub(ebdd9a87-3859-41df-9bb9-7b8244bbebaa)\"" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" podUID="ebdd9a87-3859-41df-9bb9-7b8244bbebaa" Apr 16 19:27:57.786917 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.786885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc7l2\" (UniqueName: \"kubernetes.io/projected/0718cf45-497e-48d3-8dc6-e073adda1fea-kube-api-access-zc7l2\") pod \"kserve-controller-manager-856948b99f-s95tf\" (UID: \"0718cf45-497e-48d3-8dc6-e073adda1fea\") " pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 19:27:57.787129 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.786978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0718cf45-497e-48d3-8dc6-e073adda1fea-cert\") pod \"kserve-controller-manager-856948b99f-s95tf\" (UID: \"0718cf45-497e-48d3-8dc6-e073adda1fea\") " pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 19:27:57.787129 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:27:57.787077 2578 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 19:27:57.787251 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:27:57.787150 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0718cf45-497e-48d3-8dc6-e073adda1fea-cert podName:0718cf45-497e-48d3-8dc6-e073adda1fea nodeName:}" failed. 
No retries permitted until 2026-04-16 19:27:58.287125556 +0000 UTC m=+590.025028813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0718cf45-497e-48d3-8dc6-e073adda1fea-cert") pod "kserve-controller-manager-856948b99f-s95tf" (UID: "0718cf45-497e-48d3-8dc6-e073adda1fea") : secret "kserve-webhook-server-cert" not found Apr 16 19:27:57.806547 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:57.806514 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc7l2\" (UniqueName: \"kubernetes.io/projected/0718cf45-497e-48d3-8dc6-e073adda1fea-kube-api-access-zc7l2\") pod \"kserve-controller-manager-856948b99f-s95tf\" (UID: \"0718cf45-497e-48d3-8dc6-e073adda1fea\") " pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 19:27:58.291530 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:58.291489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0718cf45-497e-48d3-8dc6-e073adda1fea-cert\") pod \"kserve-controller-manager-856948b99f-s95tf\" (UID: \"0718cf45-497e-48d3-8dc6-e073adda1fea\") " pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 19:27:58.294311 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:58.294281 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0718cf45-497e-48d3-8dc6-e073adda1fea-cert\") pod \"kserve-controller-manager-856948b99f-s95tf\" (UID: \"0718cf45-497e-48d3-8dc6-e073adda1fea\") " pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 19:27:58.526655 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:58.526615 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 19:27:58.656038 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:58.655961 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-s95tf"] Apr 16 19:27:58.658348 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:27:58.658318 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0718cf45_497e_48d3_8dc6_e073adda1fea.slice/crio-73c76f154d5d0053046172ad2c3cfedd937f7b1784b693c24b1be8bf4ec72f34 WatchSource:0}: Error finding container 73c76f154d5d0053046172ad2c3cfedd937f7b1784b693c24b1be8bf4ec72f34: Status 404 returned error can't find the container with id 73c76f154d5d0053046172ad2c3cfedd937f7b1784b693c24b1be8bf4ec72f34 Apr 16 19:27:58.751053 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:27:58.751014 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" event={"ID":"0718cf45-497e-48d3-8dc6-e073adda1fea","Type":"ContainerStarted","Data":"73c76f154d5d0053046172ad2c3cfedd937f7b1784b693c24b1be8bf4ec72f34"} Apr 16 19:28:01.763615 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:01.763572 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" event={"ID":"0718cf45-497e-48d3-8dc6-e073adda1fea","Type":"ContainerStarted","Data":"2ae99c70f7bec974bb67abdd52feb0e9186cafd9a91b878ac350811669c4aa47"} Apr 16 19:28:01.763968 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:01.763725 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" Apr 16 19:28:01.809446 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:01.809359 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-s95tf" 
podStartSLOduration=2.232721039 podStartE2EDuration="4.809343098s" podCreationTimestamp="2026-04-16 19:27:57 +0000 UTC" firstStartedPulling="2026-04-16 19:27:58.659759562 +0000 UTC m=+590.397662808" lastFinishedPulling="2026-04-16 19:28:01.236381613 +0000 UTC m=+592.974284867" observedRunningTime="2026-04-16 19:28:01.808141421 +0000 UTC m=+593.546044689" watchObservedRunningTime="2026-04-16 19:28:01.809343098 +0000 UTC m=+593.547246366" Apr 16 19:28:02.208484 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.208367 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255"] Apr 16 19:28:02.212184 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.212159 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.215969 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.215928 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ddqj5\"" Apr 16 19:28:02.216171 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.216149 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:28:02.216268 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.216205 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:28:02.242965 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.242924 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255"] Apr 16 19:28:02.326703 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.326657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.326858 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.326717 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj2xv\" (UniqueName: \"kubernetes.io/projected/75adae89-dbd2-440f-a396-c0c6c32de3ec-kube-api-access-dj2xv\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.326858 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.326804 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.427848 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.427810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.428032 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.427860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dj2xv\" (UniqueName: 
\"kubernetes.io/projected/75adae89-dbd2-440f-a396-c0c6c32de3ec-kube-api-access-dj2xv\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.428032 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.427895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.428229 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.428205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.428270 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.428236 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.448100 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.448057 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj2xv\" (UniqueName: \"kubernetes.io/projected/75adae89-dbd2-440f-a396-c0c6c32de3ec-kube-api-access-dj2xv\") pod 
\"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.521906 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.521862 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" Apr 16 19:28:02.598386 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.598349 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" Apr 16 19:28:02.598807 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.598791 2578 scope.go:117] "RemoveContainer" containerID="029b2e6bb3976e903a46719071345f0eb7de0939d6e3981afaa0a3e4fe1a2f90" Apr 16 19:28:02.599045 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:28:02.599024 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-xt2qv_opendatahub(ebdd9a87-3859-41df-9bb9-7b8244bbebaa)\"" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" podUID="ebdd9a87-3859-41df-9bb9-7b8244bbebaa" Apr 16 19:28:02.732607 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.732569 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255"] Apr 16 19:28:02.760707 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:28:02.760660 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75adae89_dbd2_440f_a396_c0c6c32de3ec.slice/crio-03bcd1bae8482f4a0e57dee91952bd4c9505c80e3b5eb5c6413291fe14c25547 WatchSource:0}: Error finding container 03bcd1bae8482f4a0e57dee91952bd4c9505c80e3b5eb5c6413291fe14c25547: 
Status 404 returned error can't find the container with id 03bcd1bae8482f4a0e57dee91952bd4c9505c80e3b5eb5c6413291fe14c25547 Apr 16 19:28:02.768909 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:02.768877 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" event={"ID":"75adae89-dbd2-440f-a396-c0c6c32de3ec","Type":"ContainerStarted","Data":"03bcd1bae8482f4a0e57dee91952bd4c9505c80e3b5eb5c6413291fe14c25547"} Apr 16 19:28:03.774265 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:03.774224 2578 generic.go:358] "Generic (PLEG): container finished" podID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerID="6abed7a23cad158fa06a9e580eaf6ff05abc4b6d19f0058685d820106cde13c7" exitCode=0 Apr 16 19:28:03.774665 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:03.774313 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" event={"ID":"75adae89-dbd2-440f-a396-c0c6c32de3ec","Type":"ContainerDied","Data":"6abed7a23cad158fa06a9e580eaf6ff05abc4b6d19f0058685d820106cde13c7"} Apr 16 19:28:04.136457 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.136337 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9"] Apr 16 19:28:04.139870 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.139843 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" Apr 16 19:28:04.142527 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.142497 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 19:28:04.142662 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.142497 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 19:28:04.142826 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.142809 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-dtv78\"" Apr 16 19:28:04.156812 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.156780 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9"] Apr 16 19:28:04.245144 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.245097 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f90caa0b-b7ed-48fe-8776-e744dfead290-operator-config\") pod \"servicemesh-operator3-55f49c5f94-wnnv9\" (UID: \"f90caa0b-b7ed-48fe-8776-e744dfead290\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" Apr 16 19:28:04.245328 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.245185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mk2c\" (UniqueName: \"kubernetes.io/projected/f90caa0b-b7ed-48fe-8776-e744dfead290-kube-api-access-2mk2c\") pod \"servicemesh-operator3-55f49c5f94-wnnv9\" (UID: \"f90caa0b-b7ed-48fe-8776-e744dfead290\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" Apr 16 19:28:04.346545 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.346506 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f90caa0b-b7ed-48fe-8776-e744dfead290-operator-config\") pod \"servicemesh-operator3-55f49c5f94-wnnv9\" (UID: \"f90caa0b-b7ed-48fe-8776-e744dfead290\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" Apr 16 19:28:04.346750 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.346574 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mk2c\" (UniqueName: \"kubernetes.io/projected/f90caa0b-b7ed-48fe-8776-e744dfead290-kube-api-access-2mk2c\") pod \"servicemesh-operator3-55f49c5f94-wnnv9\" (UID: \"f90caa0b-b7ed-48fe-8776-e744dfead290\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" Apr 16 19:28:04.349323 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.349288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/f90caa0b-b7ed-48fe-8776-e744dfead290-operator-config\") pod \"servicemesh-operator3-55f49c5f94-wnnv9\" (UID: \"f90caa0b-b7ed-48fe-8776-e744dfead290\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" Apr 16 19:28:04.357257 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.357226 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mk2c\" (UniqueName: \"kubernetes.io/projected/f90caa0b-b7ed-48fe-8776-e744dfead290-kube-api-access-2mk2c\") pod \"servicemesh-operator3-55f49c5f94-wnnv9\" (UID: \"f90caa0b-b7ed-48fe-8776-e744dfead290\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" Apr 16 19:28:04.449199 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.449170 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" Apr 16 19:28:04.583519 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.583491 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9"] Apr 16 19:28:04.585838 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:28:04.585808 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf90caa0b_b7ed_48fe_8776_e744dfead290.slice/crio-cc621f1d70049f8519ff5de20206e21006cb4ae90cd1582d74254b5550b46d9d WatchSource:0}: Error finding container cc621f1d70049f8519ff5de20206e21006cb4ae90cd1582d74254b5550b46d9d: Status 404 returned error can't find the container with id cc621f1d70049f8519ff5de20206e21006cb4ae90cd1582d74254b5550b46d9d Apr 16 19:28:04.779949 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.779900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" event={"ID":"f90caa0b-b7ed-48fe-8776-e744dfead290","Type":"ContainerStarted","Data":"cc621f1d70049f8519ff5de20206e21006cb4ae90cd1582d74254b5550b46d9d"} Apr 16 19:28:04.781502 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.781475 2578 generic.go:358] "Generic (PLEG): container finished" podID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerID="b3e06afedb07597637cdfea6f6c981a68e1d451d480850304582451b10e7a4cb" exitCode=0 Apr 16 19:28:04.781615 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:04.781567 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" event={"ID":"75adae89-dbd2-440f-a396-c0c6c32de3ec","Type":"ContainerDied","Data":"b3e06afedb07597637cdfea6f6c981a68e1d451d480850304582451b10e7a4cb"} Apr 16 19:28:05.787794 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:05.787764 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerID="86388b09da75cbeee6c462e73cc3ca27ab5b2bcb24a61173265730fe2f255544" exitCode=0
Apr 16 19:28:05.788300 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:05.787803 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" event={"ID":"75adae89-dbd2-440f-a396-c0c6c32de3ec","Type":"ContainerDied","Data":"86388b09da75cbeee6c462e73cc3ca27ab5b2bcb24a61173265730fe2f255544"}
Apr 16 19:28:07.091585 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.091560 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255"
Apr 16 19:28:07.171179 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.171155 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-util\") pod \"75adae89-dbd2-440f-a396-c0c6c32de3ec\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") "
Apr 16 19:28:07.171298 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.171201 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-bundle\") pod \"75adae89-dbd2-440f-a396-c0c6c32de3ec\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") "
Apr 16 19:28:07.171298 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.171276 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj2xv\" (UniqueName: \"kubernetes.io/projected/75adae89-dbd2-440f-a396-c0c6c32de3ec-kube-api-access-dj2xv\") pod \"75adae89-dbd2-440f-a396-c0c6c32de3ec\" (UID: \"75adae89-dbd2-440f-a396-c0c6c32de3ec\") "
Apr 16 19:28:07.172432 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.172369 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-bundle" (OuterVolumeSpecName: "bundle") pod "75adae89-dbd2-440f-a396-c0c6c32de3ec" (UID: "75adae89-dbd2-440f-a396-c0c6c32de3ec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:28:07.173799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.173775 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75adae89-dbd2-440f-a396-c0c6c32de3ec-kube-api-access-dj2xv" (OuterVolumeSpecName: "kube-api-access-dj2xv") pod "75adae89-dbd2-440f-a396-c0c6c32de3ec" (UID: "75adae89-dbd2-440f-a396-c0c6c32de3ec"). InnerVolumeSpecName "kube-api-access-dj2xv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:28:07.176689 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.176666 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-util" (OuterVolumeSpecName: "util") pod "75adae89-dbd2-440f-a396-c0c6c32de3ec" (UID: "75adae89-dbd2-440f-a396-c0c6c32de3ec"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:28:07.272285 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.272245 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dj2xv\" (UniqueName: \"kubernetes.io/projected/75adae89-dbd2-440f-a396-c0c6c32de3ec-kube-api-access-dj2xv\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:28:07.272285 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.272283 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:28:07.272493 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.272297 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/75adae89-dbd2-440f-a396-c0c6c32de3ec-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:28:07.799745 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.799703 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" event={"ID":"f90caa0b-b7ed-48fe-8776-e744dfead290","Type":"ContainerStarted","Data":"532c297d11a4ca596fdc0917b14c7c25146915f779e3dad9d6fb11112fb39d31"}
Apr 16 19:28:07.799960 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.799793 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9"
Apr 16 19:28:07.801475 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.801443 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255" event={"ID":"75adae89-dbd2-440f-a396-c0c6c32de3ec","Type":"ContainerDied","Data":"03bcd1bae8482f4a0e57dee91952bd4c9505c80e3b5eb5c6413291fe14c25547"}
Apr 16 19:28:07.801587 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.801479 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2sw255"
Apr 16 19:28:07.801587 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.801480 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03bcd1bae8482f4a0e57dee91952bd4c9505c80e3b5eb5c6413291fe14c25547"
Apr 16 19:28:07.823002 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:07.822951 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9" podStartSLOduration=1.260107881 podStartE2EDuration="3.822934373s" podCreationTimestamp="2026-04-16 19:28:04 +0000 UTC" firstStartedPulling="2026-04-16 19:28:04.588390166 +0000 UTC m=+596.326293413" lastFinishedPulling="2026-04-16 19:28:07.151216657 +0000 UTC m=+598.889119905" observedRunningTime="2026-04-16 19:28:07.821011962 +0000 UTC m=+599.558915246" watchObservedRunningTime="2026-04-16 19:28:07.822934373 +0000 UTC m=+599.560837641"
Apr 16 19:28:08.797556 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:08.797523 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log"
Apr 16 19:28:08.797556 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:08.797537 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log"
Apr 16 19:28:12.598548 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:12.598506 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv"
Apr 16 19:28:12.598936 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:12.598892 2578 scope.go:117] "RemoveContainer" containerID="029b2e6bb3976e903a46719071345f0eb7de0939d6e3981afaa0a3e4fe1a2f90"
Apr 16 19:28:13.826799 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:13.826759 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" event={"ID":"ebdd9a87-3859-41df-9bb9-7b8244bbebaa","Type":"ContainerStarted","Data":"86e64b7e27127b5a4aa6e3a13122e5605c01f5cad4aa5e65299a37439bafcb94"}
Apr 16 19:28:13.827239 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:13.827000 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv"
Apr 16 19:28:13.850314 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:13.850248 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv" podStartSLOduration=2.725269194 podStartE2EDuration="22.85023142s" podCreationTimestamp="2026-04-16 19:27:51 +0000 UTC" firstStartedPulling="2026-04-16 19:27:52.727779875 +0000 UTC m=+584.465683126" lastFinishedPulling="2026-04-16 19:28:12.852742089 +0000 UTC m=+604.590645352" observedRunningTime="2026-04-16 19:28:13.84900241 +0000 UTC m=+605.586905681" watchObservedRunningTime="2026-04-16 19:28:13.85023142 +0000 UTC m=+605.588134690"
Apr 16 19:28:18.806957 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:18.806873 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-wnnv9"
Apr 16 19:28:24.833205 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:24.833167 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-xt2qv"
Apr 16 19:28:32.775009 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:32.774970 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-s95tf"
Apr 16 19:28:48.676839 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.676800 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8fmfw"]
Apr 16 19:28:48.678723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.677111 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerName="util"
Apr 16 19:28:48.678723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.677121 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerName="util"
Apr 16 19:28:48.678723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.677131 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerName="pull"
Apr 16 19:28:48.678723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.677138 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerName="pull"
Apr 16 19:28:48.678723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.677145 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerName="extract"
Apr 16 19:28:48.678723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.677151 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerName="extract"
Apr 16 19:28:48.678723 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.677212 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="75adae89-dbd2-440f-a396-c0c6c32de3ec" containerName="extract"
Apr 16 19:28:48.679611 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.679581 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw"
Apr 16 19:28:48.682909 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.682887 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 19:28:48.683032 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.682909 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 19:28:48.684564 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.684547 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-4lc7v\""
Apr 16 19:28:48.695995 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.695969 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8fmfw"]
Apr 16 19:28:48.727932 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.727898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w2nx\" (UniqueName: \"kubernetes.io/projected/6073e468-570c-48ce-b0b9-0d90bf19f663-kube-api-access-4w2nx\") pod \"kuadrant-operator-catalog-8fmfw\" (UID: \"6073e468-570c-48ce-b0b9-0d90bf19f663\") " pod="kuadrant-system/kuadrant-operator-catalog-8fmfw"
Apr 16 19:28:48.829272 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.829223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4w2nx\" (UniqueName: \"kubernetes.io/projected/6073e468-570c-48ce-b0b9-0d90bf19f663-kube-api-access-4w2nx\") pod \"kuadrant-operator-catalog-8fmfw\" (UID: \"6073e468-570c-48ce-b0b9-0d90bf19f663\") " pod="kuadrant-system/kuadrant-operator-catalog-8fmfw"
Apr 16 19:28:48.838893 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.838847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w2nx\" (UniqueName: \"kubernetes.io/projected/6073e468-570c-48ce-b0b9-0d90bf19f663-kube-api-access-4w2nx\") pod \"kuadrant-operator-catalog-8fmfw\" (UID: \"6073e468-570c-48ce-b0b9-0d90bf19f663\") " pod="kuadrant-system/kuadrant-operator-catalog-8fmfw"
Apr 16 19:28:48.988739 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:48.988701 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw"
Apr 16 19:28:49.118504 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.118454 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8fmfw"]
Apr 16 19:28:49.122431 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:28:49.122378 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6073e468_570c_48ce_b0b9_0d90bf19f663.slice/crio-c4ff77532718ae2abc271624952741a41ad9a20babec2a8b4fbe524c5b6fbad0 WatchSource:0}: Error finding container c4ff77532718ae2abc271624952741a41ad9a20babec2a8b4fbe524c5b6fbad0: Status 404 returned error can't find the container with id c4ff77532718ae2abc271624952741a41ad9a20babec2a8b4fbe524c5b6fbad0
Apr 16 19:28:49.269691 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.269600 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8fmfw"]
Apr 16 19:28:49.282881 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.282849 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zzkfr"]
Apr 16 19:28:49.286864 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.286843 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zzkfr"
Apr 16 19:28:49.293827 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.293797 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zzkfr"]
Apr 16 19:28:49.334266 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.334232 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhp6t\" (UniqueName: \"kubernetes.io/projected/ed475c47-de86-443a-bd41-0a5b65b616c9-kube-api-access-nhp6t\") pod \"kuadrant-operator-catalog-zzkfr\" (UID: \"ed475c47-de86-443a-bd41-0a5b65b616c9\") " pod="kuadrant-system/kuadrant-operator-catalog-zzkfr"
Apr 16 19:28:49.434971 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.434931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhp6t\" (UniqueName: \"kubernetes.io/projected/ed475c47-de86-443a-bd41-0a5b65b616c9-kube-api-access-nhp6t\") pod \"kuadrant-operator-catalog-zzkfr\" (UID: \"ed475c47-de86-443a-bd41-0a5b65b616c9\") " pod="kuadrant-system/kuadrant-operator-catalog-zzkfr"
Apr 16 19:28:49.444076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.444039 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhp6t\" (UniqueName: \"kubernetes.io/projected/ed475c47-de86-443a-bd41-0a5b65b616c9-kube-api-access-nhp6t\") pod \"kuadrant-operator-catalog-zzkfr\" (UID: \"ed475c47-de86-443a-bd41-0a5b65b616c9\") " pod="kuadrant-system/kuadrant-operator-catalog-zzkfr"
Apr 16 19:28:49.598874 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.598785 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-zzkfr"
Apr 16 19:28:49.745941 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.745913 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-zzkfr"]
Apr 16 19:28:49.748522 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:28:49.748494 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded475c47_de86_443a_bd41_0a5b65b616c9.slice/crio-85aff5d21dccba4059909dbc85c315465cf610c051e41005158baf4c1898efda WatchSource:0}: Error finding container 85aff5d21dccba4059909dbc85c315465cf610c051e41005158baf4c1898efda: Status 404 returned error can't find the container with id 85aff5d21dccba4059909dbc85c315465cf610c051e41005158baf4c1898efda
Apr 16 19:28:49.771457 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.771402 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"]
Apr 16 19:28:49.774748 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.774725 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.777107 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.777084 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 19:28:49.777229 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.777165 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 19:28:49.777368 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.777346 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-m7sct\""
Apr 16 19:28:49.777462 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.777372 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 19:28:49.777462 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.777376 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 19:28:49.784581 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.784545 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"]
Apr 16 19:28:49.838782 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.838749 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ml9l\" (UniqueName: \"kubernetes.io/projected/9b488a9b-2a6e-46f5-80ea-620284daa662-kube-api-access-2ml9l\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.838782 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.838792 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.839028 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.838825 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.839028 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.838877 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.839028 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.838915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.839028 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.838948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9b488a9b-2a6e-46f5-80ea-620284daa662-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.839028 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.839020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9b488a9b-2a6e-46f5-80ea-620284daa662-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.940247 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.940164 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ml9l\" (UniqueName: \"kubernetes.io/projected/9b488a9b-2a6e-46f5-80ea-620284daa662-kube-api-access-2ml9l\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.940247 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.940204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.940247 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.940229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.940624 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.940263 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.940624 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.940300 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.940624 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.940325 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9b488a9b-2a6e-46f5-80ea-620284daa662-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.940624 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.940391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9b488a9b-2a6e-46f5-80ea-620284daa662-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.941046 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.941006 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.943707 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.943678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.943836 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.943722 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/9b488a9b-2a6e-46f5-80ea-620284daa662-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.943836 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.943722 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.943960 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.943933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/9b488a9b-2a6e-46f5-80ea-620284daa662-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.949218 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.949187 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ml9l\" (UniqueName: \"kubernetes.io/projected/9b488a9b-2a6e-46f5-80ea-620284daa662-kube-api-access-2ml9l\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.949418 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.949388 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/9b488a9b-2a6e-46f5-80ea-620284daa662-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-bnzvp\" (UID: \"9b488a9b-2a6e-46f5-80ea-620284daa662\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:49.954582 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.954531 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw" event={"ID":"6073e468-570c-48ce-b0b9-0d90bf19f663","Type":"ContainerStarted","Data":"c4ff77532718ae2abc271624952741a41ad9a20babec2a8b4fbe524c5b6fbad0"}
Apr 16 19:28:49.956019 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:49.955990 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-zzkfr" event={"ID":"ed475c47-de86-443a-bd41-0a5b65b616c9","Type":"ContainerStarted","Data":"85aff5d21dccba4059909dbc85c315465cf610c051e41005158baf4c1898efda"}
Apr 16 19:28:50.085929 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:50.085881 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:50.562023 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:50.561992 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"]
Apr 16 19:28:50.625261 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:28:50.625219 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b488a9b_2a6e_46f5_80ea_620284daa662.slice/crio-23cff45961d922bc2d4f1f95c60a08e16263c8f6489f7298b5a2912bf4a0c219 WatchSource:0}: Error finding container 23cff45961d922bc2d4f1f95c60a08e16263c8f6489f7298b5a2912bf4a0c219: Status 404 returned error can't find the container with id 23cff45961d922bc2d4f1f95c60a08e16263c8f6489f7298b5a2912bf4a0c219
Apr 16 19:28:50.967726 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:50.967680 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp" event={"ID":"9b488a9b-2a6e-46f5-80ea-620284daa662","Type":"ContainerStarted","Data":"23cff45961d922bc2d4f1f95c60a08e16263c8f6489f7298b5a2912bf4a0c219"}
Apr 16 19:28:51.974590 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:51.974551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw" event={"ID":"6073e468-570c-48ce-b0b9-0d90bf19f663","Type":"ContainerStarted","Data":"047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e"}
Apr 16 19:28:51.975052 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:51.974664 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw" podUID="6073e468-570c-48ce-b0b9-0d90bf19f663" containerName="registry-server" containerID="cri-o://047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e" gracePeriod=2
Apr 16 19:28:51.976939 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:51.976883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-zzkfr" event={"ID":"ed475c47-de86-443a-bd41-0a5b65b616c9","Type":"ContainerStarted","Data":"ef1fd57c6cd62b9e75585d3967d0558503172dc9dbf4c8c1771eb52f883cc6b8"}
Apr 16 19:28:51.991583 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:51.991532 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw" podStartSLOduration=1.7244781 podStartE2EDuration="3.991514632s" podCreationTimestamp="2026-04-16 19:28:48 +0000 UTC" firstStartedPulling="2026-04-16 19:28:49.123730242 +0000 UTC m=+640.861633488" lastFinishedPulling="2026-04-16 19:28:51.390766772 +0000 UTC m=+643.128670020" observedRunningTime="2026-04-16 19:28:51.990256028 +0000 UTC m=+643.728159297" watchObservedRunningTime="2026-04-16 19:28:51.991514632 +0000 UTC m=+643.729417903"
Apr 16 19:28:52.013668 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.013599 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-zzkfr" podStartSLOduration=1.3724934389999999 podStartE2EDuration="3.013578641s" podCreationTimestamp="2026-04-16 19:28:49 +0000 UTC" firstStartedPulling="2026-04-16 19:28:49.750160488 +0000 UTC m=+641.488063734" lastFinishedPulling="2026-04-16 19:28:51.39124569 +0000 UTC m=+643.129148936" observedRunningTime="2026-04-16 19:28:52.012493839 +0000 UTC m=+643.750397108" watchObservedRunningTime="2026-04-16 19:28:52.013578641 +0000 UTC m=+643.751481912"
Apr 16 19:28:52.242683 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.242656 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw"
Apr 16 19:28:52.364417 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.364357 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w2nx\" (UniqueName: \"kubernetes.io/projected/6073e468-570c-48ce-b0b9-0d90bf19f663-kube-api-access-4w2nx\") pod \"6073e468-570c-48ce-b0b9-0d90bf19f663\" (UID: \"6073e468-570c-48ce-b0b9-0d90bf19f663\") "
Apr 16 19:28:52.367252 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.367211 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6073e468-570c-48ce-b0b9-0d90bf19f663-kube-api-access-4w2nx" (OuterVolumeSpecName: "kube-api-access-4w2nx") pod "6073e468-570c-48ce-b0b9-0d90bf19f663" (UID: "6073e468-570c-48ce-b0b9-0d90bf19f663"). InnerVolumeSpecName "kube-api-access-4w2nx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:28:52.465792 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.465733 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4w2nx\" (UniqueName: \"kubernetes.io/projected/6073e468-570c-48ce-b0b9-0d90bf19f663-kube-api-access-4w2nx\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:28:52.982925 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.982886 2578 generic.go:358] "Generic (PLEG): container finished" podID="6073e468-570c-48ce-b0b9-0d90bf19f663" containerID="047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e" exitCode=0
Apr 16 19:28:52.983392 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.982953 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw"
Apr 16 19:28:52.983392 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.982974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw" event={"ID":"6073e468-570c-48ce-b0b9-0d90bf19f663","Type":"ContainerDied","Data":"047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e"}
Apr 16 19:28:52.983392 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.983032 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-8fmfw" event={"ID":"6073e468-570c-48ce-b0b9-0d90bf19f663","Type":"ContainerDied","Data":"c4ff77532718ae2abc271624952741a41ad9a20babec2a8b4fbe524c5b6fbad0"}
Apr 16 19:28:52.983392 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:52.983058 2578 scope.go:117] "RemoveContainer" containerID="047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e"
Apr 16 19:28:53.000350 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:53.000313 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8fmfw"]
Apr 16 19:28:53.002725 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:53.002690 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-8fmfw"]
Apr 16 19:28:53.265363 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:53.265328 2578 scope.go:117] "RemoveContainer" containerID="047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e"
Apr 16 19:28:53.265741 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:28:53.265719 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e\": container with ID starting with 047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e not found: ID does not exist" containerID="047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e"
Apr 16 19:28:53.265796 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:53.265750 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e"} err="failed to get container status \"047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e\": rpc error: code = NotFound desc = could not find container \"047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e\": container with ID starting with 047beb47b3e37f6974327d2bd6716fcb41af90b165c91491a7724892163af73e not found: ID does not exist"
Apr 16 19:28:53.327571 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:53.327522 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 16 19:28:53.327688 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:53.327613 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 16 19:28:53.990575 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:53.990530 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp" event={"ID":"9b488a9b-2a6e-46f5-80ea-620284daa662","Type":"ContainerStarted","Data":"4a97d86397d48d7dc2ae5c506d09da519547e6d736253363ce5dee2fc14796d0"}
Apr 16 19:28:53.991037 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:53.990851 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:54.017336 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:54.017244 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp" podStartSLOduration=2.317583766 podStartE2EDuration="5.017223291s" podCreationTimestamp="2026-04-16 19:28:49 +0000 UTC" firstStartedPulling="2026-04-16 19:28:50.627575478 +0000 UTC m=+642.365478723" lastFinishedPulling="2026-04-16 19:28:53.327214994 +0000 UTC m=+645.065118248" observedRunningTime="2026-04-16 19:28:54.014071823 +0000 UTC m=+645.751975106" watchObservedRunningTime="2026-04-16 19:28:54.017223291 +0000 UTC m=+645.755126564"
Apr 16 19:28:54.887093 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:54.887057 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6073e468-570c-48ce-b0b9-0d90bf19f663" path="/var/lib/kubelet/pods/6073e468-570c-48ce-b0b9-0d90bf19f663/volumes"
Apr 16 19:28:54.996792 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:54.996764 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-bnzvp"
Apr 16 19:28:59.599116 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:59.599067 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-zzkfr"
Apr 16 19:28:59.599116 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:59.599128 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-zzkfr"
Apr 16 19:28:59.622267 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:28:59.622237 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-zzkfr"
Apr 16 19:29:00.036502 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:00.036469 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-zzkfr"
Apr 16 19:29:04.304922 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.304882 2578 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd"] Apr 16 19:29:04.305328 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.305217 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6073e468-570c-48ce-b0b9-0d90bf19f663" containerName="registry-server" Apr 16 19:29:04.305328 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.305228 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6073e468-570c-48ce-b0b9-0d90bf19f663" containerName="registry-server" Apr 16 19:29:04.305328 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.305291 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6073e468-570c-48ce-b0b9-0d90bf19f663" containerName="registry-server" Apr 16 19:29:04.309548 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.309528 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.312024 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.312002 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-c46nd\"" Apr 16 19:29:04.318035 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.318010 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd"] Apr 16 19:29:04.368678 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.368617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.368861 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:29:04.368792 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctcrv\" (UniqueName: \"kubernetes.io/projected/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-kube-api-access-ctcrv\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.368905 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.368863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.469529 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.469480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctcrv\" (UniqueName: \"kubernetes.io/projected/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-kube-api-access-ctcrv\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.469727 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.469548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.469727 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.469580 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.469977 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.469957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.470032 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.469973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.479660 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.479624 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctcrv\" (UniqueName: \"kubernetes.io/projected/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-kube-api-access-ctcrv\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.619525 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.619395 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:04.753335 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.753308 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd"] Apr 16 19:29:04.755483 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:29:04.755442 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7cd65a_fd18_4bfb_b4cf_9ac911ae2aab.slice/crio-b9db93fd74f00441b811feafa9aa7754446ea49ad7b3a209bdbe8e27b99d9035 WatchSource:0}: Error finding container b9db93fd74f00441b811feafa9aa7754446ea49ad7b3a209bdbe8e27b99d9035: Status 404 returned error can't find the container with id b9db93fd74f00441b811feafa9aa7754446ea49ad7b3a209bdbe8e27b99d9035 Apr 16 19:29:04.902336 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.902248 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp"] Apr 16 19:29:04.905808 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.905789 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:04.913499 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.913478 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp"] Apr 16 19:29:04.975333 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.975286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbj5f\" (UniqueName: \"kubernetes.io/projected/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-kube-api-access-lbj5f\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:04.975520 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.975436 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:04.975607 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:04.975588 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:05.031765 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.031723 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerID="5bbb77952a5ef2fca3b31e360bd40d569e76941ff59e8c1f03e8565cacb3afa2" exitCode=0 Apr 16 19:29:05.031956 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.031816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" event={"ID":"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab","Type":"ContainerDied","Data":"5bbb77952a5ef2fca3b31e360bd40d569e76941ff59e8c1f03e8565cacb3afa2"} Apr 16 19:29:05.031956 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.031861 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" event={"ID":"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab","Type":"ContainerStarted","Data":"b9db93fd74f00441b811feafa9aa7754446ea49ad7b3a209bdbe8e27b99d9035"} Apr 16 19:29:05.077019 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.076983 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:05.077209 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.077047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbj5f\" (UniqueName: \"kubernetes.io/projected/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-kube-api-access-lbj5f\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:05.077209 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.077069 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:05.077527 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.077505 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:05.077605 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.077509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:05.086370 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.086337 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbj5f\" (UniqueName: \"kubernetes.io/projected/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-kube-api-access-lbj5f\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:05.232928 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.232828 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:05.360850 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.360823 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp"] Apr 16 19:29:05.363040 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:29:05.363008 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d817d9e_99f2_4cf7_86f6_9c1b51fccaec.slice/crio-27890a5eb859c5f2e99f036e992bba8b044d6e8ac951179d542050771f45939d WatchSource:0}: Error finding container 27890a5eb859c5f2e99f036e992bba8b044d6e8ac951179d542050771f45939d: Status 404 returned error can't find the container with id 27890a5eb859c5f2e99f036e992bba8b044d6e8ac951179d542050771f45939d Apr 16 19:29:05.500991 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.500947 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54"] Apr 16 19:29:05.504359 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.504334 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.514793 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.514762 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54"] Apr 16 19:29:05.582662 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.582609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.582816 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.582675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.582816 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.582726 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbjk7\" (UniqueName: \"kubernetes.io/projected/179bc98b-905e-44ee-b185-aeabdd7718ec-kube-api-access-sbjk7\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.684080 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.684029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.684230 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.684132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.684230 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.684177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbjk7\" (UniqueName: \"kubernetes.io/projected/179bc98b-905e-44ee-b185-aeabdd7718ec-kube-api-access-sbjk7\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.684420 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.684380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.684463 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.684423 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-util\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.692957 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.692932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbjk7\" (UniqueName: \"kubernetes.io/projected/179bc98b-905e-44ee-b185-aeabdd7718ec-kube-api-access-sbjk7\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.846051 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.845950 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:05.910758 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.910717 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf"] Apr 16 19:29:05.914889 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.914862 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:05.925320 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.924858 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf"] Apr 16 19:29:05.986639 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.986602 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:05.986761 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.986695 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:05.986761 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:05.986723 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwb4\" (UniqueName: \"kubernetes.io/projected/0e531a1f-6088-43fd-8a79-76d9f94a2aea-kube-api-access-hhwb4\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:06.011359 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.011322 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54"] Apr 16 19:29:06.012767 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:29:06.012727 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod179bc98b_905e_44ee_b185_aeabdd7718ec.slice/crio-abade578222a92a1871dc327e6b2787754f8ebcae671b57ec69f81b9f0c273af WatchSource:0}: Error finding container abade578222a92a1871dc327e6b2787754f8ebcae671b57ec69f81b9f0c273af: Status 404 returned error can't find the container with id abade578222a92a1871dc327e6b2787754f8ebcae671b57ec69f81b9f0c273af Apr 16 19:29:06.037294 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.037264 2578 generic.go:358] "Generic (PLEG): container finished" podID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerID="fb0e336ffdd6aaa2de6a2c72b8d5d2cb4c1993628462df5942b3e403caddbfe8" exitCode=0 Apr 16 19:29:06.037432 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.037346 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" event={"ID":"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec","Type":"ContainerDied","Data":"fb0e336ffdd6aaa2de6a2c72b8d5d2cb4c1993628462df5942b3e403caddbfe8"} Apr 16 19:29:06.037432 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.037379 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" event={"ID":"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec","Type":"ContainerStarted","Data":"27890a5eb859c5f2e99f036e992bba8b044d6e8ac951179d542050771f45939d"} Apr 16 19:29:06.038544 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.038518 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" 
event={"ID":"179bc98b-905e-44ee-b185-aeabdd7718ec","Type":"ContainerStarted","Data":"abade578222a92a1871dc327e6b2787754f8ebcae671b57ec69f81b9f0c273af"} Apr 16 19:29:06.040064 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.040041 2578 generic.go:358] "Generic (PLEG): container finished" podID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerID="304e32f307d28cc1f7305cf45ffcecd0d5cc72781e9ecb4baae19bafa8e8d6fc" exitCode=0 Apr 16 19:29:06.040142 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.040116 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" event={"ID":"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab","Type":"ContainerDied","Data":"304e32f307d28cc1f7305cf45ffcecd0d5cc72781e9ecb4baae19bafa8e8d6fc"} Apr 16 19:29:06.087822 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.087796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:06.087940 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.087831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:06.087940 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.087858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwb4\" (UniqueName: 
\"kubernetes.io/projected/0e531a1f-6088-43fd-8a79-76d9f94a2aea-kube-api-access-hhwb4\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:06.088263 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.088242 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:06.088333 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.088275 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:06.096431 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.096349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwb4\" (UniqueName: \"kubernetes.io/projected/0e531a1f-6088-43fd-8a79-76d9f94a2aea-kube-api-access-hhwb4\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:06.238375 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.238343 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:06.363148 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:06.363074 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf"] Apr 16 19:29:06.366240 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:29:06.366212 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e531a1f_6088_43fd_8a79_76d9f94a2aea.slice/crio-1a9359a5a7dca89b8615310ac716b2943cd0cccc336f19af9d07ab7499f231b2 WatchSource:0}: Error finding container 1a9359a5a7dca89b8615310ac716b2943cd0cccc336f19af9d07ab7499f231b2: Status 404 returned error can't find the container with id 1a9359a5a7dca89b8615310ac716b2943cd0cccc336f19af9d07ab7499f231b2 Apr 16 19:29:07.045887 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:07.045835 2578 generic.go:358] "Generic (PLEG): container finished" podID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerID="2adb9ca0026f6714ed93fd4c2b9c8b4d7bcbfbbe1c4874ac89b92a48e1090198" exitCode=0 Apr 16 19:29:07.046094 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:07.045914 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" event={"ID":"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec","Type":"ContainerDied","Data":"2adb9ca0026f6714ed93fd4c2b9c8b4d7bcbfbbe1c4874ac89b92a48e1090198"} Apr 16 19:29:07.047463 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:07.047397 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerID="7971c0b054cc18aa26035f17cd478221b203927dda1a14bfdc2f2ae3ab146756" exitCode=0 Apr 16 19:29:07.047583 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:07.047436 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" event={"ID":"0e531a1f-6088-43fd-8a79-76d9f94a2aea","Type":"ContainerDied","Data":"7971c0b054cc18aa26035f17cd478221b203927dda1a14bfdc2f2ae3ab146756"} Apr 16 19:29:07.047583 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:07.047554 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" event={"ID":"0e531a1f-6088-43fd-8a79-76d9f94a2aea","Type":"ContainerStarted","Data":"1a9359a5a7dca89b8615310ac716b2943cd0cccc336f19af9d07ab7499f231b2"} Apr 16 19:29:07.049086 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:07.049059 2578 generic.go:358] "Generic (PLEG): container finished" podID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerID="fb490b944183ee6b18a49ab4870b35731bbb7b0fb1027ad52eb8848be61576c9" exitCode=0 Apr 16 19:29:07.049238 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:07.049167 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" event={"ID":"179bc98b-905e-44ee-b185-aeabdd7718ec","Type":"ContainerDied","Data":"fb490b944183ee6b18a49ab4870b35731bbb7b0fb1027ad52eb8848be61576c9"} Apr 16 19:29:07.051913 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:07.051888 2578 generic.go:358] "Generic (PLEG): container finished" podID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerID="c3a04fe20ab5f4c5d8fad5e681ab541184eb0a00f05f6733b26bd6bb46e8636f" exitCode=0 Apr 16 19:29:07.051995 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:07.051921 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" event={"ID":"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab","Type":"ContainerDied","Data":"c3a04fe20ab5f4c5d8fad5e681ab541184eb0a00f05f6733b26bd6bb46e8636f"} Apr 16 19:29:08.058535 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.058493 2578 
generic.go:358] "Generic (PLEG): container finished" podID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerID="3ada06cefce407618ece240f6dece676e2f65aee81ec9694d02ed2be827f65ef" exitCode=0 Apr 16 19:29:08.058971 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.058570 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" event={"ID":"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec","Type":"ContainerDied","Data":"3ada06cefce407618ece240f6dece676e2f65aee81ec9694d02ed2be827f65ef"} Apr 16 19:29:08.060213 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.060185 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerID="894ad0027a2a3b80a72681c2bc3f07ba803c2a72d5e991731e3ea0288b77a06b" exitCode=0 Apr 16 19:29:08.060338 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.060264 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" event={"ID":"0e531a1f-6088-43fd-8a79-76d9f94a2aea","Type":"ContainerDied","Data":"894ad0027a2a3b80a72681c2bc3f07ba803c2a72d5e991731e3ea0288b77a06b"} Apr 16 19:29:08.062022 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.061906 2578 generic.go:358] "Generic (PLEG): container finished" podID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerID="cbc85c29dda7d8bac641371300fead2e8353ffd7eabbc638bd27de0bf80a52de" exitCode=0 Apr 16 19:29:08.062166 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.062142 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" event={"ID":"179bc98b-905e-44ee-b185-aeabdd7718ec","Type":"ContainerDied","Data":"cbc85c29dda7d8bac641371300fead2e8353ffd7eabbc638bd27de0bf80a52de"} Apr 16 19:29:08.192345 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.192316 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:08.310095 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.310054 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctcrv\" (UniqueName: \"kubernetes.io/projected/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-kube-api-access-ctcrv\") pod \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " Apr 16 19:29:08.310265 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.310102 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-util\") pod \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " Apr 16 19:29:08.310325 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.310296 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-bundle\") pod \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\" (UID: \"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab\") " Apr 16 19:29:08.310831 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.310807 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-bundle" (OuterVolumeSpecName: "bundle") pod "1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" (UID: "1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:08.312460 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.312435 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-kube-api-access-ctcrv" (OuterVolumeSpecName: "kube-api-access-ctcrv") pod "1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" (UID: "1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab"). InnerVolumeSpecName "kube-api-access-ctcrv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:29:08.315375 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.315348 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-util" (OuterVolumeSpecName: "util") pod "1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" (UID: "1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:08.411683 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.411582 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:08.411683 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.411624 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctcrv\" (UniqueName: \"kubernetes.io/projected/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-kube-api-access-ctcrv\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:08.411683 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:08.411637 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:09.068070 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.067977 2578 generic.go:358] "Generic 
(PLEG): container finished" podID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerID="02804f4a9d3ae19341f150c6865273665394e2121e1a1c5b87ce0c8620812c66" exitCode=0 Apr 16 19:29:09.068070 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.068010 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" event={"ID":"0e531a1f-6088-43fd-8a79-76d9f94a2aea","Type":"ContainerDied","Data":"02804f4a9d3ae19341f150c6865273665394e2121e1a1c5b87ce0c8620812c66"} Apr 16 19:29:09.069911 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.069887 2578 generic.go:358] "Generic (PLEG): container finished" podID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerID="66a7a7d204ff0b64bc8ce13ddc4cdf1ebc282014ef553c0b3438e59e97c83917" exitCode=0 Apr 16 19:29:09.070057 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.069969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" event={"ID":"179bc98b-905e-44ee-b185-aeabdd7718ec","Type":"ContainerDied","Data":"66a7a7d204ff0b64bc8ce13ddc4cdf1ebc282014ef553c0b3438e59e97c83917"} Apr 16 19:29:09.071651 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.071629 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" Apr 16 19:29:09.071766 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.071630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd" event={"ID":"1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab","Type":"ContainerDied","Data":"b9db93fd74f00441b811feafa9aa7754446ea49ad7b3a209bdbe8e27b99d9035"} Apr 16 19:29:09.071766 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.071731 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9db93fd74f00441b811feafa9aa7754446ea49ad7b3a209bdbe8e27b99d9035" Apr 16 19:29:09.200784 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.200757 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:09.322583 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.322486 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-util\") pod \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " Apr 16 19:29:09.322583 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.322563 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-bundle\") pod \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " Apr 16 19:29:09.322795 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.322590 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbj5f\" (UniqueName: \"kubernetes.io/projected/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-kube-api-access-lbj5f\") pod 
\"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\" (UID: \"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec\") " Apr 16 19:29:09.323045 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.323019 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-bundle" (OuterVolumeSpecName: "bundle") pod "8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" (UID: "8d817d9e-99f2-4cf7-86f6-9c1b51fccaec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:09.324966 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.324943 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-kube-api-access-lbj5f" (OuterVolumeSpecName: "kube-api-access-lbj5f") pod "8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" (UID: "8d817d9e-99f2-4cf7-86f6-9c1b51fccaec"). InnerVolumeSpecName "kube-api-access-lbj5f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:29:09.327714 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.327678 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-util" (OuterVolumeSpecName: "util") pod "8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" (UID: "8d817d9e-99f2-4cf7-86f6-9c1b51fccaec"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:09.423651 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.423595 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:09.423651 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.423643 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbj5f\" (UniqueName: \"kubernetes.io/projected/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-kube-api-access-lbj5f\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:09.423651 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:09.423657 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8d817d9e-99f2-4cf7-86f6-9c1b51fccaec-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:10.076880 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.076843 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" Apr 16 19:29:10.076880 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.076858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp" event={"ID":"8d817d9e-99f2-4cf7-86f6-9c1b51fccaec","Type":"ContainerDied","Data":"27890a5eb859c5f2e99f036e992bba8b044d6e8ac951179d542050771f45939d"} Apr 16 19:29:10.077401 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.076899 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27890a5eb859c5f2e99f036e992bba8b044d6e8ac951179d542050771f45939d" Apr 16 19:29:10.205565 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.205543 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:10.229914 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.229891 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:10.331125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.331028 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-util\") pod \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " Apr 16 19:29:10.331125 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.331074 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-bundle\") pod \"179bc98b-905e-44ee-b185-aeabdd7718ec\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " Apr 16 19:29:10.331324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.331132 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-util\") pod \"179bc98b-905e-44ee-b185-aeabdd7718ec\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " Apr 16 19:29:10.331324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.331159 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-bundle\") pod \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " Apr 16 19:29:10.331324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.331221 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbjk7\" (UniqueName: 
\"kubernetes.io/projected/179bc98b-905e-44ee-b185-aeabdd7718ec-kube-api-access-sbjk7\") pod \"179bc98b-905e-44ee-b185-aeabdd7718ec\" (UID: \"179bc98b-905e-44ee-b185-aeabdd7718ec\") " Apr 16 19:29:10.331324 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.331246 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhwb4\" (UniqueName: \"kubernetes.io/projected/0e531a1f-6088-43fd-8a79-76d9f94a2aea-kube-api-access-hhwb4\") pod \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\" (UID: \"0e531a1f-6088-43fd-8a79-76d9f94a2aea\") " Apr 16 19:29:10.331739 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.331702 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-bundle" (OuterVolumeSpecName: "bundle") pod "179bc98b-905e-44ee-b185-aeabdd7718ec" (UID: "179bc98b-905e-44ee-b185-aeabdd7718ec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:10.332018 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.331973 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-bundle" (OuterVolumeSpecName: "bundle") pod "0e531a1f-6088-43fd-8a79-76d9f94a2aea" (UID: "0e531a1f-6088-43fd-8a79-76d9f94a2aea"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:10.333844 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.333818 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e531a1f-6088-43fd-8a79-76d9f94a2aea-kube-api-access-hhwb4" (OuterVolumeSpecName: "kube-api-access-hhwb4") pod "0e531a1f-6088-43fd-8a79-76d9f94a2aea" (UID: "0e531a1f-6088-43fd-8a79-76d9f94a2aea"). InnerVolumeSpecName "kube-api-access-hhwb4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:29:10.333945 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.333855 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179bc98b-905e-44ee-b185-aeabdd7718ec-kube-api-access-sbjk7" (OuterVolumeSpecName: "kube-api-access-sbjk7") pod "179bc98b-905e-44ee-b185-aeabdd7718ec" (UID: "179bc98b-905e-44ee-b185-aeabdd7718ec"). InnerVolumeSpecName "kube-api-access-sbjk7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:29:10.339627 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.339590 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-util" (OuterVolumeSpecName: "util") pod "0e531a1f-6088-43fd-8a79-76d9f94a2aea" (UID: "0e531a1f-6088-43fd-8a79-76d9f94a2aea"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:10.340000 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.339983 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-util" (OuterVolumeSpecName: "util") pod "179bc98b-905e-44ee-b185-aeabdd7718ec" (UID: "179bc98b-905e-44ee-b185-aeabdd7718ec"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:29:10.432330 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.432289 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbjk7\" (UniqueName: \"kubernetes.io/projected/179bc98b-905e-44ee-b185-aeabdd7718ec-kube-api-access-sbjk7\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:10.432330 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.432328 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hhwb4\" (UniqueName: \"kubernetes.io/projected/0e531a1f-6088-43fd-8a79-76d9f94a2aea-kube-api-access-hhwb4\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:10.432605 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.432345 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:10.432605 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.432360 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:10.432605 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.432375 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/179bc98b-905e-44ee-b185-aeabdd7718ec-util\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:10.432605 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:10.432389 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e531a1f-6088-43fd-8a79-76d9f94a2aea-bundle\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:29:11.082208 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:11.082159 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" event={"ID":"0e531a1f-6088-43fd-8a79-76d9f94a2aea","Type":"ContainerDied","Data":"1a9359a5a7dca89b8615310ac716b2943cd0cccc336f19af9d07ab7499f231b2"} Apr 16 19:29:11.082208 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:11.082192 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf" Apr 16 19:29:11.082208 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:11.082205 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a9359a5a7dca89b8615310ac716b2943cd0cccc336f19af9d07ab7499f231b2" Apr 16 19:29:11.083927 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:11.083901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" event={"ID":"179bc98b-905e-44ee-b185-aeabdd7718ec","Type":"ContainerDied","Data":"abade578222a92a1871dc327e6b2787754f8ebcae671b57ec69f81b9f0c273af"} Apr 16 19:29:11.083927 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:11.083925 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54" Apr 16 19:29:11.084076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:11.083933 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abade578222a92a1871dc327e6b2787754f8ebcae671b57ec69f81b9f0c273af" Apr 16 19:29:24.770277 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.770223 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-2nkqg"] Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.770986 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerName="extract" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771017 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerName="extract" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771032 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerName="util" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771046 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerName="util" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771068 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerName="util" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771076 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerName="util" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771094 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerName="pull" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771102 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerName="pull" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771120 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerName="pull" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771127 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerName="pull" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771136 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerName="extract" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771144 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerName="extract" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771161 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerName="extract" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771168 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerName="extract" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771184 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerName="pull" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771192 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerName="pull" Apr 16 
19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771202 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerName="extract" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771210 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerName="extract" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771227 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerName="util" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771234 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerName="util" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771251 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerName="util" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771258 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerName="util" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771274 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerName="pull" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771281 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerName="pull" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771464 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d817d9e-99f2-4cf7-86f6-9c1b51fccaec" containerName="extract" Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:29:24.771487 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab" containerName="extract"
Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771500 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="179bc98b-905e-44ee-b185-aeabdd7718ec" containerName="extract"
Apr 16 19:29:24.772134 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.771517 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e531a1f-6088-43fd-8a79-76d9f94a2aea" containerName="extract"
Apr 16 19:29:24.775171 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.775148 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-2nkqg"
Apr 16 19:29:24.777766 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.777742 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-plbsp\""
Apr 16 19:29:24.778267 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.778244 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-2nkqg"]
Apr 16 19:29:24.955618 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:24.955578 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krm7r\" (UniqueName: \"kubernetes.io/projected/42eaca63-e226-40c1-ad9a-98319b8d009f-kube-api-access-krm7r\") pod \"authorino-operator-657f44b778-2nkqg\" (UID: \"42eaca63-e226-40c1-ad9a-98319b8d009f\") " pod="kuadrant-system/authorino-operator-657f44b778-2nkqg"
Apr 16 19:29:25.056394 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:25.056294 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krm7r\" (UniqueName: \"kubernetes.io/projected/42eaca63-e226-40c1-ad9a-98319b8d009f-kube-api-access-krm7r\") pod \"authorino-operator-657f44b778-2nkqg\" (UID: \"42eaca63-e226-40c1-ad9a-98319b8d009f\") " pod="kuadrant-system/authorino-operator-657f44b778-2nkqg"
Apr 16 19:29:25.069433 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:25.069380 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krm7r\" (UniqueName: \"kubernetes.io/projected/42eaca63-e226-40c1-ad9a-98319b8d009f-kube-api-access-krm7r\") pod \"authorino-operator-657f44b778-2nkqg\" (UID: \"42eaca63-e226-40c1-ad9a-98319b8d009f\") " pod="kuadrant-system/authorino-operator-657f44b778-2nkqg"
Apr 16 19:29:25.086314 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:25.086283 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-2nkqg"
Apr 16 19:29:25.241448 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:29:25.241397 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42eaca63_e226_40c1_ad9a_98319b8d009f.slice/crio-d2a99cd69a7feec1aee974af3fa1e720179121bcd2cc414a5c003e704b77bd42 WatchSource:0}: Error finding container d2a99cd69a7feec1aee974af3fa1e720179121bcd2cc414a5c003e704b77bd42: Status 404 returned error can't find the container with id d2a99cd69a7feec1aee974af3fa1e720179121bcd2cc414a5c003e704b77bd42
Apr 16 19:29:25.246167 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:25.246144 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-2nkqg"]
Apr 16 19:29:26.143810 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:26.143771 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-2nkqg" event={"ID":"42eaca63-e226-40c1-ad9a-98319b8d009f","Type":"ContainerStarted","Data":"d2a99cd69a7feec1aee974af3fa1e720179121bcd2cc414a5c003e704b77bd42"}
Apr 16 19:29:27.148844 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:27.148797 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-2nkqg" event={"ID":"42eaca63-e226-40c1-ad9a-98319b8d009f","Type":"ContainerStarted","Data":"4e443730188dd6ca9cd83bf562c68d483eccf84592560812bc8f94e505815580"}
Apr 16 19:29:27.149273 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:27.148919 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-2nkqg"
Apr 16 19:29:27.181924 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:27.181799 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-2nkqg" podStartSLOduration=1.515412964 podStartE2EDuration="3.181781677s" podCreationTimestamp="2026-04-16 19:29:24 +0000 UTC" firstStartedPulling="2026-04-16 19:29:25.24397356 +0000 UTC m=+676.981876821" lastFinishedPulling="2026-04-16 19:29:26.910342286 +0000 UTC m=+678.648245534" observedRunningTime="2026-04-16 19:29:27.178897008 +0000 UTC m=+678.916800276" watchObservedRunningTime="2026-04-16 19:29:27.181781677 +0000 UTC m=+678.919684944"
Apr 16 19:29:38.155098 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:38.155049 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-2nkqg"
Apr 16 19:29:52.148599 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.148516 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"]
Apr 16 19:29:52.152247 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.152223 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:29:52.155051 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.155030 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-fm64z\""
Apr 16 19:29:52.164382 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.164356 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"]
Apr 16 19:29:52.189639 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.189594 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24b5fb3d-610a-4a51-b573-65fd203f1a2d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cqf46\" (UID: \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:29:52.189823 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.189761 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7fc8\" (UniqueName: \"kubernetes.io/projected/24b5fb3d-610a-4a51-b573-65fd203f1a2d-kube-api-access-b7fc8\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cqf46\" (UID: \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:29:52.290395 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.290342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7fc8\" (UniqueName: \"kubernetes.io/projected/24b5fb3d-610a-4a51-b573-65fd203f1a2d-kube-api-access-b7fc8\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cqf46\" (UID: \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:29:52.290641 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.290451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24b5fb3d-610a-4a51-b573-65fd203f1a2d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cqf46\" (UID: \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:29:52.290867 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.290842 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24b5fb3d-610a-4a51-b573-65fd203f1a2d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cqf46\" (UID: \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:29:52.306957 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.306917 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7fc8\" (UniqueName: \"kubernetes.io/projected/24b5fb3d-610a-4a51-b573-65fd203f1a2d-kube-api-access-b7fc8\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cqf46\" (UID: \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:29:52.463164 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.463055 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:29:52.609727 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.609697 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"]
Apr 16 19:29:52.612026 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:29:52.611998 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24b5fb3d_610a_4a51_b573_65fd203f1a2d.slice/crio-7ea717f5d9604722d9c8feda4d37a6dab703a1f2016dfbe49d6d94a51efeb23e WatchSource:0}: Error finding container 7ea717f5d9604722d9c8feda4d37a6dab703a1f2016dfbe49d6d94a51efeb23e: Status 404 returned error can't find the container with id 7ea717f5d9604722d9c8feda4d37a6dab703a1f2016dfbe49d6d94a51efeb23e
Apr 16 19:29:52.615021 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:52.615004 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:29:53.252628 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:53.252584 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46" event={"ID":"24b5fb3d-610a-4a51-b573-65fd203f1a2d","Type":"ContainerStarted","Data":"7ea717f5d9604722d9c8feda4d37a6dab703a1f2016dfbe49d6d94a51efeb23e"}
Apr 16 19:29:57.272657 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:57.272610 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46" event={"ID":"24b5fb3d-610a-4a51-b573-65fd203f1a2d","Type":"ContainerStarted","Data":"13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000"}
Apr 16 19:29:57.273156 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:57.272756 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:29:57.316588 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:29:57.316534 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46" podStartSLOduration=1.459832781 podStartE2EDuration="5.316516302s" podCreationTimestamp="2026-04-16 19:29:52 +0000 UTC" firstStartedPulling="2026-04-16 19:29:52.615135906 +0000 UTC m=+704.353039151" lastFinishedPulling="2026-04-16 19:29:56.471819426 +0000 UTC m=+708.209722672" observedRunningTime="2026-04-16 19:29:57.314036371 +0000 UTC m=+709.051939640" watchObservedRunningTime="2026-04-16 19:29:57.316516302 +0000 UTC m=+709.054419648"
Apr 16 19:30:08.279013 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:08.278977 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"
Apr 16 19:30:58.253464 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.253431 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-74dd74c689-btkxl"]
Apr 16 19:30:58.256805 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.256781 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-74dd74c689-btkxl"
Apr 16 19:30:58.260105 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.260083 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 16 19:30:58.260225 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.260149 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-kq8gt\""
Apr 16 19:30:58.271036 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.271004 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-74dd74c689-btkxl"]
Apr 16 19:30:58.387096 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.387041 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfr44\" (UniqueName: \"kubernetes.io/projected/32dbcf5b-1ddf-44eb-a1e0-1d187279c952-kube-api-access-lfr44\") pod \"maas-controller-74dd74c689-btkxl\" (UID: \"32dbcf5b-1ddf-44eb-a1e0-1d187279c952\") " pod="opendatahub/maas-controller-74dd74c689-btkxl"
Apr 16 19:30:58.487648 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.487608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfr44\" (UniqueName: \"kubernetes.io/projected/32dbcf5b-1ddf-44eb-a1e0-1d187279c952-kube-api-access-lfr44\") pod \"maas-controller-74dd74c689-btkxl\" (UID: \"32dbcf5b-1ddf-44eb-a1e0-1d187279c952\") " pod="opendatahub/maas-controller-74dd74c689-btkxl"
Apr 16 19:30:58.496562 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.496526 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfr44\" (UniqueName: \"kubernetes.io/projected/32dbcf5b-1ddf-44eb-a1e0-1d187279c952-kube-api-access-lfr44\") pod \"maas-controller-74dd74c689-btkxl\" (UID: \"32dbcf5b-1ddf-44eb-a1e0-1d187279c952\") " pod="opendatahub/maas-controller-74dd74c689-btkxl"
Apr 16 19:30:58.566838 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.566753 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-74dd74c689-btkxl"
Apr 16 19:30:58.714917 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:58.714879 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-74dd74c689-btkxl"]
Apr 16 19:30:58.718463 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:30:58.718436 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32dbcf5b_1ddf_44eb_a1e0_1d187279c952.slice/crio-5785917f5a582c4db62454ee53f307a648f24b0f1a9176f35e8340053e728551 WatchSource:0}: Error finding container 5785917f5a582c4db62454ee53f307a648f24b0f1a9176f35e8340053e728551: Status 404 returned error can't find the container with id 5785917f5a582c4db62454ee53f307a648f24b0f1a9176f35e8340053e728551
Apr 16 19:30:59.158288 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.158249 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-866cd599f-xjm69"]
Apr 16 19:30:59.163041 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.162990 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:30:59.166845 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.166821 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 16 19:30:59.167789 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.167763 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-cczkd\""
Apr 16 19:30:59.189768 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.189736 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-866cd599f-xjm69"]
Apr 16 19:30:59.294965 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.294928 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ngqk\" (UniqueName: \"kubernetes.io/projected/dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d-kube-api-access-7ngqk\") pod \"maas-api-866cd599f-xjm69\" (UID: \"dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d\") " pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:30:59.295377 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.295017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d-maas-api-tls\") pod \"maas-api-866cd599f-xjm69\" (UID: \"dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d\") " pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:30:59.396170 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.396125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d-maas-api-tls\") pod \"maas-api-866cd599f-xjm69\" (UID: \"dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d\") " pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:30:59.396361 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.396223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ngqk\" (UniqueName: \"kubernetes.io/projected/dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d-kube-api-access-7ngqk\") pod \"maas-api-866cd599f-xjm69\" (UID: \"dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d\") " pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:30:59.399387 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.399357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d-maas-api-tls\") pod \"maas-api-866cd599f-xjm69\" (UID: \"dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d\") " pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:30:59.431133 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.431043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ngqk\" (UniqueName: \"kubernetes.io/projected/dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d-kube-api-access-7ngqk\") pod \"maas-api-866cd599f-xjm69\" (UID: \"dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d\") " pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:30:59.473616 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.473573 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:30:59.515170 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.515103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74dd74c689-btkxl" event={"ID":"32dbcf5b-1ddf-44eb-a1e0-1d187279c952","Type":"ContainerStarted","Data":"5785917f5a582c4db62454ee53f307a648f24b0f1a9176f35e8340053e728551"}
Apr 16 19:30:59.695524 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:30:59.695462 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-866cd599f-xjm69"]
Apr 16 19:30:59.696720 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:30:59.696694 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc0c645f_bc3d_4d13_9cd4_6ec51e6c2a4d.slice/crio-ccc73ee100b38e1c3a81db16f28a0c5c53782c800c9febaee808f027bec242ed WatchSource:0}: Error finding container ccc73ee100b38e1c3a81db16f28a0c5c53782c800c9febaee808f027bec242ed: Status 404 returned error can't find the container with id ccc73ee100b38e1c3a81db16f28a0c5c53782c800c9febaee808f027bec242ed
Apr 16 19:31:00.521924 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:00.521884 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-866cd599f-xjm69" event={"ID":"dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d","Type":"ContainerStarted","Data":"ccc73ee100b38e1c3a81db16f28a0c5c53782c800c9febaee808f027bec242ed"}
Apr 16 19:31:01.531491 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:01.531442 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74dd74c689-btkxl" event={"ID":"32dbcf5b-1ddf-44eb-a1e0-1d187279c952","Type":"ContainerStarted","Data":"870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6"}
Apr 16 19:31:01.531944 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:01.531769 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-74dd74c689-btkxl"
Apr 16 19:31:01.553527 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:01.553459 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-74dd74c689-btkxl" podStartSLOduration=1.097665474 podStartE2EDuration="3.553440742s" podCreationTimestamp="2026-04-16 19:30:58 +0000 UTC" firstStartedPulling="2026-04-16 19:30:58.720034886 +0000 UTC m=+770.457938133" lastFinishedPulling="2026-04-16 19:31:01.175810152 +0000 UTC m=+772.913713401" observedRunningTime="2026-04-16 19:31:01.551578346 +0000 UTC m=+773.289481615" watchObservedRunningTime="2026-04-16 19:31:01.553440742 +0000 UTC m=+773.291344009"
Apr 16 19:31:02.537820 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:02.537782 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-866cd599f-xjm69" event={"ID":"dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d","Type":"ContainerStarted","Data":"3592f412ce4cffbd3f4424a3711820e606281ef7aa9fbebcb9bc279a5859e87c"}
Apr 16 19:31:02.538302 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:02.537895 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:31:02.557160 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:02.557102 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-866cd599f-xjm69" podStartSLOduration=1.30984152 podStartE2EDuration="3.557083475s" podCreationTimestamp="2026-04-16 19:30:59 +0000 UTC" firstStartedPulling="2026-04-16 19:30:59.698619093 +0000 UTC m=+771.436522346" lastFinishedPulling="2026-04-16 19:31:01.945861049 +0000 UTC m=+773.683764301" observedRunningTime="2026-04-16 19:31:02.555325911 +0000 UTC m=+774.293229180" watchObservedRunningTime="2026-04-16 19:31:02.557083475 +0000 UTC m=+774.294986742"
Apr 16 19:31:08.547151 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:08.547118 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-866cd599f-xjm69"
Apr 16 19:31:12.545835 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:12.545803 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-74dd74c689-btkxl"
Apr 16 19:31:26.523905 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:26.523803 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-74dd74c689-btkxl"]
Apr 16 19:31:26.524329 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:26.524115 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-74dd74c689-btkxl" podUID="32dbcf5b-1ddf-44eb-a1e0-1d187279c952" containerName="manager" containerID="cri-o://870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6" gracePeriod=10
Apr 16 19:31:26.770098 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:26.770055 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-74dd74c689-btkxl"
Apr 16 19:31:26.823806 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:26.823717 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfr44\" (UniqueName: \"kubernetes.io/projected/32dbcf5b-1ddf-44eb-a1e0-1d187279c952-kube-api-access-lfr44\") pod \"32dbcf5b-1ddf-44eb-a1e0-1d187279c952\" (UID: \"32dbcf5b-1ddf-44eb-a1e0-1d187279c952\") "
Apr 16 19:31:26.826058 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:26.826031 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32dbcf5b-1ddf-44eb-a1e0-1d187279c952-kube-api-access-lfr44" (OuterVolumeSpecName: "kube-api-access-lfr44") pod "32dbcf5b-1ddf-44eb-a1e0-1d187279c952" (UID: "32dbcf5b-1ddf-44eb-a1e0-1d187279c952"). InnerVolumeSpecName "kube-api-access-lfr44". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:31:26.924907 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:26.924859 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lfr44\" (UniqueName: \"kubernetes.io/projected/32dbcf5b-1ddf-44eb-a1e0-1d187279c952-kube-api-access-lfr44\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\""
Apr 16 19:31:27.637505 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:27.637458 2578 generic.go:358] "Generic (PLEG): container finished" podID="32dbcf5b-1ddf-44eb-a1e0-1d187279c952" containerID="870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6" exitCode=0
Apr 16 19:31:27.637952 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:27.637522 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-74dd74c689-btkxl"
Apr 16 19:31:27.637952 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:27.637552 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74dd74c689-btkxl" event={"ID":"32dbcf5b-1ddf-44eb-a1e0-1d187279c952","Type":"ContainerDied","Data":"870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6"}
Apr 16 19:31:27.637952 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:27.637592 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-74dd74c689-btkxl" event={"ID":"32dbcf5b-1ddf-44eb-a1e0-1d187279c952","Type":"ContainerDied","Data":"5785917f5a582c4db62454ee53f307a648f24b0f1a9176f35e8340053e728551"}
Apr 16 19:31:27.637952 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:27.637610 2578 scope.go:117] "RemoveContainer" containerID="870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6"
Apr 16 19:31:27.646470 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:27.646446 2578 scope.go:117] "RemoveContainer" containerID="870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6"
Apr 16 19:31:27.646732 ip-10-0-130-163 
kubenswrapper[2578]: E0416 19:31:27.646712 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6\": container with ID starting with 870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6 not found: ID does not exist" containerID="870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6"
Apr 16 19:31:27.646786 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:27.646744 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6"} err="failed to get container status \"870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6\": rpc error: code = NotFound desc = could not find container \"870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6\": container with ID starting with 870c190b669de1b657a11b41431fbcdbc90ca4571313f1eff77895a6213fe8a6 not found: ID does not exist"
Apr 16 19:31:27.655339 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:27.655311 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-74dd74c689-btkxl"]
Apr 16 19:31:27.659207 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:27.659181 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-74dd74c689-btkxl"]
Apr 16 19:31:28.887631 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:28.887597 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32dbcf5b-1ddf-44eb-a1e0-1d187279c952" path="/var/lib/kubelet/pods/32dbcf5b-1ddf-44eb-a1e0-1d187279c952/volumes"
Apr 16 19:31:52.178880 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.178841 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"]
Apr 16 19:31:52.179356 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.179197 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32dbcf5b-1ddf-44eb-a1e0-1d187279c952" containerName="manager"
Apr 16 19:31:52.179356 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.179209 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="32dbcf5b-1ddf-44eb-a1e0-1d187279c952" containerName="manager"
Apr 16 19:31:52.179356 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.179275 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="32dbcf5b-1ddf-44eb-a1e0-1d187279c952" containerName="manager"
Apr 16 19:31:52.182435 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.182397 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.186434 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.186384 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 16 19:31:52.186434 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.186424 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-94ng7\""
Apr 16 19:31:52.186619 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.186445 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 16 19:31:52.186619 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.186402 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 16 19:31:52.193230 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.193203 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"]
Apr 16 19:31:52.246671 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.246621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.246847 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.246677 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.246847 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.246757 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptftf\" (UniqueName: \"kubernetes.io/projected/0a25ddc3-d68d-47af-93b0-38968711bc7e-kube-api-access-ptftf\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.246847 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.246802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.246975 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.246846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a25ddc3-d68d-47af-93b0-38968711bc7e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.246975 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.246874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.348048 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.348001 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.348048 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.348047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.348299 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.348077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptftf\" (UniqueName: \"kubernetes.io/projected/0a25ddc3-d68d-47af-93b0-38968711bc7e-kube-api-access-ptftf\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.348299 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:31:52.348125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.348299 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.348178 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a25ddc3-d68d-47af-93b0-38968711bc7e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.348299 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.348211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.348575 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.348548 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.348641 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.348581 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.348641 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.348625 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.350601 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.350575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0a25ddc3-d68d-47af-93b0-38968711bc7e-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.350819 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.350797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0a25ddc3-d68d-47af-93b0-38968711bc7e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.357916 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.357892 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptftf\" (UniqueName: \"kubernetes.io/projected/0a25ddc3-d68d-47af-93b0-38968711bc7e-kube-api-access-ptftf\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9\" (UID: \"0a25ddc3-d68d-47af-93b0-38968711bc7e\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.494028 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.493985 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"
Apr 16 19:31:52.639661 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.639624 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9"]
Apr 16 19:31:52.641497 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:31:52.641462 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a25ddc3_d68d_47af_93b0_38968711bc7e.slice/crio-68ebe2fd450a27253bf2585cccb4a075a4ed4318ae40c82a89a2a8367ccd0acd WatchSource:0}: Error finding container 68ebe2fd450a27253bf2585cccb4a075a4ed4318ae40c82a89a2a8367ccd0acd: Status 404 returned error can't find the container with id 68ebe2fd450a27253bf2585cccb4a075a4ed4318ae40c82a89a2a8367ccd0acd
Apr 16 19:31:52.751101 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:52.751012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9" event={"ID":"0a25ddc3-d68d-47af-93b0-38968711bc7e","Type":"ContainerStarted","Data":"68ebe2fd450a27253bf2585cccb4a075a4ed4318ae40c82a89a2a8367ccd0acd"}
Apr 16 19:31:58.153524 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.153479 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9"]
Apr 16 19:31:58.189129 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.189067 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9"]
Apr 16 19:31:58.189315 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.189248 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.192876 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.192662 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 16 19:31:58.307878 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.307831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.307878 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.307886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.308156 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.307922 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.308156 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.307950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-kserve-provision-location\") pod 
\"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.308156 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.308029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbh64\" (UniqueName: \"kubernetes.io/projected/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-kube-api-access-bbh64\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.308156 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.308066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.409703 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.409611 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.409891 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.409713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.409891 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:31:58.409734 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.409891 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.409766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.409891 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.409785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.409891 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.409883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbh64\" (UniqueName: \"kubernetes.io/projected/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-kube-api-access-bbh64\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.410218 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.410192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-home\") pod 
\"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.410481 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.410222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.410539 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.410505 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.412190 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.412166 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.412704 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.412686 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.419534 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.419463 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bbh64\" (UniqueName: \"kubernetes.io/projected/5c9ae267-b0d1-4411-bbb6-28a8b01022c8-kube-api-access-bbh64\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9\" (UID: \"5c9ae267-b0d1-4411-bbb6-28a8b01022c8\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:58.511390 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:58.511037 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:31:59.283002 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:59.282972 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9"] Apr 16 19:31:59.284749 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:31:59.284722 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c9ae267_b0d1_4411_bbb6_28a8b01022c8.slice/crio-d442c4e59eba4362b91b8695475576c5b8c1257b3a977f1ebe51acbc40ca3d6d WatchSource:0}: Error finding container d442c4e59eba4362b91b8695475576c5b8c1257b3a977f1ebe51acbc40ca3d6d: Status 404 returned error can't find the container with id d442c4e59eba4362b91b8695475576c5b8c1257b3a977f1ebe51acbc40ca3d6d Apr 16 19:31:59.784533 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:59.784475 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" event={"ID":"5c9ae267-b0d1-4411-bbb6-28a8b01022c8","Type":"ContainerStarted","Data":"e0224a48362b44d7f0920d53af313983f0fefb0d22a896e09263d6a2fae4c047"} Apr 16 19:31:59.784533 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:59.784517 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" 
event={"ID":"5c9ae267-b0d1-4411-bbb6-28a8b01022c8","Type":"ContainerStarted","Data":"d442c4e59eba4362b91b8695475576c5b8c1257b3a977f1ebe51acbc40ca3d6d"} Apr 16 19:31:59.785992 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:31:59.785942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9" event={"ID":"0a25ddc3-d68d-47af-93b0-38968711bc7e","Type":"ContainerStarted","Data":"08dac7c1555a7dd4c3d2c6e25199852d155a571e009c421a9eb75cf6176859a9"} Apr 16 19:32:05.822131 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:05.822090 2578 generic.go:358] "Generic (PLEG): container finished" podID="5c9ae267-b0d1-4411-bbb6-28a8b01022c8" containerID="e0224a48362b44d7f0920d53af313983f0fefb0d22a896e09263d6a2fae4c047" exitCode=0 Apr 16 19:32:05.822639 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:05.822181 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" event={"ID":"5c9ae267-b0d1-4411-bbb6-28a8b01022c8","Type":"ContainerDied","Data":"e0224a48362b44d7f0920d53af313983f0fefb0d22a896e09263d6a2fae4c047"} Apr 16 19:32:05.823887 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:05.823863 2578 generic.go:358] "Generic (PLEG): container finished" podID="0a25ddc3-d68d-47af-93b0-38968711bc7e" containerID="08dac7c1555a7dd4c3d2c6e25199852d155a571e009c421a9eb75cf6176859a9" exitCode=0 Apr 16 19:32:05.823963 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:05.823940 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9" event={"ID":"0a25ddc3-d68d-47af-93b0-38968711bc7e","Type":"ContainerDied","Data":"08dac7c1555a7dd4c3d2c6e25199852d155a571e009c421a9eb75cf6176859a9"} Apr 16 19:32:10.848200 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:10.848151 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" 
event={"ID":"5c9ae267-b0d1-4411-bbb6-28a8b01022c8","Type":"ContainerStarted","Data":"86aee563f06e25a65143b7033f1c96a99b5a25d877a8845b74d2abced720a4ac"} Apr 16 19:32:10.848656 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:10.848381 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:32:10.849865 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:10.849838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9" event={"ID":"0a25ddc3-d68d-47af-93b0-38968711bc7e","Type":"ContainerStarted","Data":"6e62b580ae964bce03898a7e2590b829232ac5a9ea6ab1551b9991ebd965a6c0"} Apr 16 19:32:10.850063 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:10.850046 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9" Apr 16 19:32:10.870232 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:10.870178 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" podStartSLOduration=8.231762821 podStartE2EDuration="12.870163053s" podCreationTimestamp="2026-04-16 19:31:58 +0000 UTC" firstStartedPulling="2026-04-16 19:32:05.823074394 +0000 UTC m=+837.560977646" lastFinishedPulling="2026-04-16 19:32:10.461474626 +0000 UTC m=+842.199377878" observedRunningTime="2026-04-16 19:32:10.867497701 +0000 UTC m=+842.605400984" watchObservedRunningTime="2026-04-16 19:32:10.870163053 +0000 UTC m=+842.608066361" Apr 16 19:32:10.890553 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:10.890506 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9" podStartSLOduration=1.055536248 podStartE2EDuration="18.890490311s" podCreationTimestamp="2026-04-16 19:31:52 +0000 UTC" firstStartedPulling="2026-04-16 19:31:52.643629841 +0000 UTC 
m=+824.381533090" lastFinishedPulling="2026-04-16 19:32:10.478583902 +0000 UTC m=+842.216487153" observedRunningTime="2026-04-16 19:32:10.887149926 +0000 UTC m=+842.625053219" watchObservedRunningTime="2026-04-16 19:32:10.890490311 +0000 UTC m=+842.628393579" Apr 16 19:32:16.046525 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.046482 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7"] Apr 16 19:32:16.053491 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.053459 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.056399 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.056374 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 16 19:32:16.061582 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.061169 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7"] Apr 16 19:32:16.077007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.076972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.077007 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.077010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9dbl\" (UniqueName: \"kubernetes.io/projected/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-kube-api-access-z9dbl\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" 
Apr 16 19:32:16.077191 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.077046 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.077191 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.077064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.077191 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.077137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.077191 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.077158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.178563 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.178517 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.178778 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.178571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9dbl\" (UniqueName: \"kubernetes.io/projected/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-kube-api-access-z9dbl\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.178778 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.178608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.178778 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.178631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.178778 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.178690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.178994 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:32:16.178826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.179183 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.179160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.179248 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.179189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.179248 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.179216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.181164 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.181140 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-dshm\") pod 
\"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.181400 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.181382 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.188701 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.188676 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9dbl\" (UniqueName: \"kubernetes.io/projected/800ec9bc-dbdc-4abc-a578-5c04c5ee19a8-kube-api-access-z9dbl\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-sqzm7\" (UID: \"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.365993 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.365881 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:16.513013 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.512982 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7"] Apr 16 19:32:16.515358 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:32:16.515325 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800ec9bc_dbdc_4abc_a578_5c04c5ee19a8.slice/crio-9cbc8e2da1a58d4207331ae684f7a1bb395ec9366cf9353119a8e5e1c8bc54d0 WatchSource:0}: Error finding container 9cbc8e2da1a58d4207331ae684f7a1bb395ec9366cf9353119a8e5e1c8bc54d0: Status 404 returned error can't find the container with id 9cbc8e2da1a58d4207331ae684f7a1bb395ec9366cf9353119a8e5e1c8bc54d0 Apr 16 19:32:16.874687 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.874626 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" event={"ID":"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8","Type":"ContainerStarted","Data":"85320a5230f420579d220250d3ef1df32660b3cd5419e7395745a556aff52c05"} Apr 16 19:32:16.874687 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:16.874689 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" event={"ID":"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8","Type":"ContainerStarted","Data":"9cbc8e2da1a58d4207331ae684f7a1bb395ec9366cf9353119a8e5e1c8bc54d0"} Apr 16 19:32:21.867664 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:21.867629 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9" Apr 16 19:32:21.868603 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:21.868581 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9" Apr 16 19:32:22.901368 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:32:22.901322 2578 generic.go:358] "Generic (PLEG): container finished" podID="800ec9bc-dbdc-4abc-a578-5c04c5ee19a8" containerID="85320a5230f420579d220250d3ef1df32660b3cd5419e7395745a556aff52c05" exitCode=0 Apr 16 19:32:22.901889 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:22.901395 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" event={"ID":"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8","Type":"ContainerDied","Data":"85320a5230f420579d220250d3ef1df32660b3cd5419e7395745a556aff52c05"} Apr 16 19:32:23.915115 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:23.915078 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" event={"ID":"800ec9bc-dbdc-4abc-a578-5c04c5ee19a8","Type":"ContainerStarted","Data":"aa399bb4dba19023113ac4d6b31337163cdf2c188baec9be3cbd659293e506d8"} Apr 16 19:32:23.915607 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:23.915352 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:32:23.934814 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:23.934759 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" podStartSLOduration=7.694401447 podStartE2EDuration="7.934742716s" podCreationTimestamp="2026-04-16 19:32:16 +0000 UTC" firstStartedPulling="2026-04-16 19:32:22.902288818 +0000 UTC m=+854.640192069" lastFinishedPulling="2026-04-16 19:32:23.14263009 +0000 UTC m=+854.880533338" observedRunningTime="2026-04-16 19:32:23.933023104 +0000 UTC m=+855.670926406" watchObservedRunningTime="2026-04-16 19:32:23.934742716 +0000 UTC m=+855.672645984" Apr 16 19:32:34.933460 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:32:34.933401 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-sqzm7" Apr 16 19:33:08.839349 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:33:08.839313 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:33:08.842186 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:33:08.842158 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:34:47.976900 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:47.976857 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-66c6fd6db6-k4gb7"] Apr 16 19:34:47.980400 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:47.980381 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" Apr 16 19:34:47.983224 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:47.983199 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-kq8gt\"" Apr 16 19:34:47.992001 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:47.991972 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66c6fd6db6-k4gb7"] Apr 16 19:34:48.052523 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:48.052481 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mtp9\" (UniqueName: \"kubernetes.io/projected/fe45c81b-39c5-421f-848e-09468ccfff65-kube-api-access-2mtp9\") pod \"maas-controller-66c6fd6db6-k4gb7\" (UID: \"fe45c81b-39c5-421f-848e-09468ccfff65\") " pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" Apr 16 19:34:48.153219 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:48.153177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mtp9\" 
(UniqueName: \"kubernetes.io/projected/fe45c81b-39c5-421f-848e-09468ccfff65-kube-api-access-2mtp9\") pod \"maas-controller-66c6fd6db6-k4gb7\" (UID: \"fe45c81b-39c5-421f-848e-09468ccfff65\") " pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" Apr 16 19:34:48.163482 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:48.163455 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mtp9\" (UniqueName: \"kubernetes.io/projected/fe45c81b-39c5-421f-848e-09468ccfff65-kube-api-access-2mtp9\") pod \"maas-controller-66c6fd6db6-k4gb7\" (UID: \"fe45c81b-39c5-421f-848e-09468ccfff65\") " pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" Apr 16 19:34:48.291977 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:48.291934 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" Apr 16 19:34:48.440701 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:48.438144 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66c6fd6db6-k4gb7"] Apr 16 19:34:48.522561 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:48.522521 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" event={"ID":"fe45c81b-39c5-421f-848e-09468ccfff65","Type":"ContainerStarted","Data":"55d7c254291fbe00afeebb2b704fbd776835c3e0a3a10872cbb4a5827fbe2388"} Apr 16 19:34:49.528856 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:49.528818 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" event={"ID":"fe45c81b-39c5-421f-848e-09468ccfff65","Type":"ContainerStarted","Data":"e55adbff8b6aee0727d9448533180d2e6b7ecdd39469876ff561810bdd0adec5"} Apr 16 19:34:49.529228 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:34:49.528928 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" Apr 16 19:34:49.549673 ip-10-0-130-163 
kubenswrapper[2578]: I0416 19:34:49.549610 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" podStartSLOduration=2.158392975 podStartE2EDuration="2.549589548s" podCreationTimestamp="2026-04-16 19:34:47 +0000 UTC" firstStartedPulling="2026-04-16 19:34:48.438441128 +0000 UTC m=+1000.176344388" lastFinishedPulling="2026-04-16 19:34:48.829637711 +0000 UTC m=+1000.567540961" observedRunningTime="2026-04-16 19:34:49.548781538 +0000 UTC m=+1001.286684804" watchObservedRunningTime="2026-04-16 19:34:49.549589548 +0000 UTC m=+1001.287492816" Apr 16 19:35:00.540057 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:35:00.540014 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-66c6fd6db6-k4gb7" Apr 16 19:38:08.875667 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:38:08.875635 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:38:08.880857 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:38:08.880832 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:43:08.910481 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:43:08.910436 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:43:08.917595 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:43:08.917567 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:44:59.417244 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:44:59.417158 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"] Apr 16 19:44:59.417764 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:44:59.417464 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46" podUID="24b5fb3d-610a-4a51-b573-65fd203f1a2d" containerName="manager" containerID="cri-o://13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000" gracePeriod=10 Apr 16 19:44:59.665608 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:44:59.665581 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46" Apr 16 19:44:59.701441 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:44:59.701340 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7fc8\" (UniqueName: \"kubernetes.io/projected/24b5fb3d-610a-4a51-b573-65fd203f1a2d-kube-api-access-b7fc8\") pod \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\" (UID: \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\") " Apr 16 19:44:59.701601 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:44:59.701453 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24b5fb3d-610a-4a51-b573-65fd203f1a2d-extensions-socket-volume\") pod \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\" (UID: \"24b5fb3d-610a-4a51-b573-65fd203f1a2d\") " Apr 16 19:44:59.701820 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:44:59.701795 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24b5fb3d-610a-4a51-b573-65fd203f1a2d-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "24b5fb3d-610a-4a51-b573-65fd203f1a2d" (UID: "24b5fb3d-610a-4a51-b573-65fd203f1a2d"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:44:59.703797 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:44:59.703769 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b5fb3d-610a-4a51-b573-65fd203f1a2d-kube-api-access-b7fc8" (OuterVolumeSpecName: "kube-api-access-b7fc8") pod "24b5fb3d-610a-4a51-b573-65fd203f1a2d" (UID: "24b5fb3d-610a-4a51-b573-65fd203f1a2d"). InnerVolumeSpecName "kube-api-access-b7fc8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:44:59.802602 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:44:59.802565 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/24b5fb3d-610a-4a51-b573-65fd203f1a2d-extensions-socket-volume\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:44:59.802602 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:44:59.802600 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b7fc8\" (UniqueName: \"kubernetes.io/projected/24b5fb3d-610a-4a51-b573-65fd203f1a2d-kube-api-access-b7fc8\") on node \"ip-10-0-130-163.ec2.internal\" DevicePath \"\"" Apr 16 19:45:00.061396 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.061355 2578 generic.go:358] "Generic (PLEG): container finished" podID="24b5fb3d-610a-4a51-b573-65fd203f1a2d" containerID="13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000" exitCode=0 Apr 16 19:45:00.061628 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.061443 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46" Apr 16 19:45:00.061628 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.061451 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46" event={"ID":"24b5fb3d-610a-4a51-b573-65fd203f1a2d","Type":"ContainerDied","Data":"13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000"} Apr 16 19:45:00.061628 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.061492 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46" event={"ID":"24b5fb3d-610a-4a51-b573-65fd203f1a2d","Type":"ContainerDied","Data":"7ea717f5d9604722d9c8feda4d37a6dab703a1f2016dfbe49d6d94a51efeb23e"} Apr 16 19:45:00.061628 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.061510 2578 scope.go:117] "RemoveContainer" containerID="13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000" Apr 16 19:45:00.072160 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.072136 2578 scope.go:117] "RemoveContainer" containerID="13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000" Apr 16 19:45:00.072499 ip-10-0-130-163 kubenswrapper[2578]: E0416 19:45:00.072474 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000\": container with ID starting with 13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000 not found: ID does not exist" containerID="13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000" Apr 16 19:45:00.072580 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.072507 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000"} err="failed to get container status 
\"13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000\": rpc error: code = NotFound desc = could not find container \"13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000\": container with ID starting with 13342c5f3387d7ee3bc1e333958e23eb67f052094ce2783c3a94842b87110000 not found: ID does not exist" Apr 16 19:45:00.085171 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.085137 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"] Apr 16 19:45:00.088899 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.088870 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cqf46"] Apr 16 19:45:00.887722 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:45:00.887683 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24b5fb3d-610a-4a51-b573-65fd203f1a2d" path="/var/lib/kubelet/pods/24b5fb3d-610a-4a51-b573-65fd203f1a2d/volumes" Apr 16 19:46:05.510894 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.510851 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h"] Apr 16 19:46:05.511363 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.511261 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24b5fb3d-610a-4a51-b573-65fd203f1a2d" containerName="manager" Apr 16 19:46:05.511363 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.511276 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b5fb3d-610a-4a51-b573-65fd203f1a2d" containerName="manager" Apr 16 19:46:05.511471 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.511364 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="24b5fb3d-610a-4a51-b573-65fd203f1a2d" containerName="manager" Apr 16 19:46:05.514463 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.514445 2578 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:46:05.516993 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.516973 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-fm64z\"" Apr 16 19:46:05.524999 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.524972 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h"] Apr 16 19:46:05.578607 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.578562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tksf2\" (UniqueName: \"kubernetes.io/projected/c5f21efe-2cad-4abc-ae30-4c9d54ddd59d-kube-api-access-tksf2\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h\" (UID: \"c5f21efe-2cad-4abc-ae30-4c9d54ddd59d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:46:05.578798 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.578627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5f21efe-2cad-4abc-ae30-4c9d54ddd59d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h\" (UID: \"c5f21efe-2cad-4abc-ae30-4c9d54ddd59d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:46:05.679316 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.679274 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5f21efe-2cad-4abc-ae30-4c9d54ddd59d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h\" (UID: \"c5f21efe-2cad-4abc-ae30-4c9d54ddd59d\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:46:05.679493 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.679431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tksf2\" (UniqueName: \"kubernetes.io/projected/c5f21efe-2cad-4abc-ae30-4c9d54ddd59d-kube-api-access-tksf2\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h\" (UID: \"c5f21efe-2cad-4abc-ae30-4c9d54ddd59d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:46:05.679727 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.679708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c5f21efe-2cad-4abc-ae30-4c9d54ddd59d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h\" (UID: \"c5f21efe-2cad-4abc-ae30-4c9d54ddd59d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:46:05.687651 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.687623 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tksf2\" (UniqueName: \"kubernetes.io/projected/c5f21efe-2cad-4abc-ae30-4c9d54ddd59d-kube-api-access-tksf2\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h\" (UID: \"c5f21efe-2cad-4abc-ae30-4c9d54ddd59d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:46:05.826448 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.826325 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:46:05.973041 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.973012 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h"] Apr 16 19:46:05.975505 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:46:05.975472 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5f21efe_2cad_4abc_ae30_4c9d54ddd59d.slice/crio-e7a8f8672e0574c739e7991690210c4821f094b5cff9f8700ae2492d77099cf1 WatchSource:0}: Error finding container e7a8f8672e0574c739e7991690210c4821f094b5cff9f8700ae2492d77099cf1: Status 404 returned error can't find the container with id e7a8f8672e0574c739e7991690210c4821f094b5cff9f8700ae2492d77099cf1 Apr 16 19:46:05.978076 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:05.978055 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:46:06.334025 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:06.333989 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" event={"ID":"c5f21efe-2cad-4abc-ae30-4c9d54ddd59d","Type":"ContainerStarted","Data":"3f4a0cc8721f361316a97e8fc08ebcd875145a17415bc5a8fcfed03ee4975083"} Apr 16 19:46:06.334025 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:06.334028 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" event={"ID":"c5f21efe-2cad-4abc-ae30-4c9d54ddd59d","Type":"ContainerStarted","Data":"e7a8f8672e0574c739e7991690210c4821f094b5cff9f8700ae2492d77099cf1"} Apr 16 19:46:06.334311 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:06.334053 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:46:06.353061 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:06.352999 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" podStartSLOduration=1.352980725 podStartE2EDuration="1.352980725s" podCreationTimestamp="2026-04-16 19:46:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:46:06.350697985 +0000 UTC m=+1678.088601252" watchObservedRunningTime="2026-04-16 19:46:06.352980725 +0000 UTC m=+1678.090883992" Apr 16 19:46:17.339744 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:46:17.339701 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h" Apr 16 19:48:08.943142 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:48:08.943013 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:48:08.953102 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:48:08.953075 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:53:08.975038 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:53:08.974906 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:53:08.986889 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:53:08.986864 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log" Apr 16 19:55:51.243751 ip-10-0-130-163 kubenswrapper[2578]: I0416 
19:55:51.243709 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-s95tf_0718cf45-497e-48d3-8dc6-e073adda1fea/manager/0.log" Apr 16 19:55:51.353774 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:51.353740 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-866cd599f-xjm69_dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d/maas-api/0.log" Apr 16 19:55:51.467009 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:51.466978 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-66c6fd6db6-k4gb7_fe45c81b-39c5-421f-848e-09468ccfff65/manager/0.log" Apr 16 19:55:51.575630 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:51.575541 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-xt2qv_ebdd9a87-3859-41df-9bb9-7b8244bbebaa/manager/2.log" Apr 16 19:55:51.695448 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:51.695394 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-66b64c949f-c9985_f2d31f98-a2f2-4976-a57e-f7e4f46a93f6/manager/0.log" Apr 16 19:55:52.803917 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:52.803876 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54_179bc98b-905e-44ee-b185-aeabdd7718ec/util/0.log" Apr 16 19:55:52.810155 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:52.810124 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54_179bc98b-905e-44ee-b185-aeabdd7718ec/pull/0.log" Apr 16 19:55:52.816029 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:52.816005 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54_179bc98b-905e-44ee-b185-aeabdd7718ec/extract/0.log" Apr 16 19:55:52.923617 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:52.923582 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp_8d817d9e-99f2-4cf7-86f6-9c1b51fccaec/extract/0.log" Apr 16 19:55:52.929041 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:52.929011 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp_8d817d9e-99f2-4cf7-86f6-9c1b51fccaec/util/0.log" Apr 16 19:55:52.934363 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:52.934343 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp_8d817d9e-99f2-4cf7-86f6-9c1b51fccaec/pull/0.log" Apr 16 19:55:53.040945 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:53.040917 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf_0e531a1f-6088-43fd-8a79-76d9f94a2aea/util/0.log" Apr 16 19:55:53.046512 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:53.046491 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf_0e531a1f-6088-43fd-8a79-76d9f94a2aea/pull/0.log" Apr 16 19:55:53.052280 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:53.052257 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf_0e531a1f-6088-43fd-8a79-76d9f94a2aea/extract/0.log" Apr 16 19:55:53.157627 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:53.157520 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd_1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab/extract/0.log" Apr 16 19:55:53.163349 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:53.163324 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd_1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab/util/0.log" Apr 16 19:55:53.169458 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:53.169425 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd_1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab/pull/0.log" Apr 16 19:55:53.395288 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:53.395256 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-2nkqg_42eaca63-e226-40c1-ad9a-98319b8d009f/manager/0.log" Apr 16 19:55:53.721776 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:53.721739 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-zzkfr_ed475c47-de86-443a-bd41-0a5b65b616c9/registry-server/0.log" Apr 16 19:55:53.830639 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:53.830601 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h_c5f21efe-2cad-4abc-ae30-4c9d54ddd59d/manager/0.log" Apr 16 19:55:54.487564 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:54.487530 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-bnzvp_9b488a9b-2a6e-46f5-80ea-620284daa662/discovery/0.log" Apr 16 19:55:54.591161 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:54.591107 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-d894ddccb-92r84_819903d8-46ef-467a-8e58-d186915a391c/kube-auth-proxy/0.log" Apr 16 
19:55:55.264727 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:55.264692 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9_5c9ae267-b0d1-4411-bbb6-28a8b01022c8/storage-initializer/0.log" Apr 16 19:55:55.271205 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:55.271178 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-6n4f9_5c9ae267-b0d1-4411-bbb6-28a8b01022c8/main/0.log" Apr 16 19:55:55.378049 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:55.378015 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-sqzm7_800ec9bc-dbdc-4abc-a578-5c04c5ee19a8/storage-initializer/0.log" Apr 16 19:55:55.384911 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:55.384880 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-sqzm7_800ec9bc-dbdc-4abc-a578-5c04c5ee19a8/main/0.log" Apr 16 19:55:55.711920 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:55.711767 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9_0a25ddc3-d68d-47af-93b0-38968711bc7e/storage-initializer/0.log" Apr 16 19:55:55.718102 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:55:55.718069 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-jtgq9_0a25ddc3-d68d-47af-93b0-38968711bc7e/main/0.log" Apr 16 19:56:02.849495 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:02.849465 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-258dd_ae60b17b-715c-479d-a9cc-496e69796c4e/global-pull-secret-syncer/0.log" Apr 16 19:56:03.022758 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:03.022728 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-h9xsl_c01ac44c-e356-4ba6-9aa3-005d0558378f/konnectivity-agent/0.log" Apr 16 19:56:03.099108 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:03.099075 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-163.ec2.internal_01a926c8ca0a1b00ab2f53afa74bfa04/haproxy/0.log" Apr 16 19:56:07.002038 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.001992 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54_179bc98b-905e-44ee-b185-aeabdd7718ec/extract/0.log" Apr 16 19:56:07.038009 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.037980 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54_179bc98b-905e-44ee-b185-aeabdd7718ec/util/0.log" Apr 16 19:56:07.059714 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.059683 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759ssp54_179bc98b-905e-44ee-b185-aeabdd7718ec/pull/0.log" Apr 16 19:56:07.098119 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.098084 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp_8d817d9e-99f2-4cf7-86f6-9c1b51fccaec/extract/0.log" Apr 16 19:56:07.118796 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.118761 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp_8d817d9e-99f2-4cf7-86f6-9c1b51fccaec/util/0.log" Apr 16 19:56:07.139873 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.139832 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0t66vp_8d817d9e-99f2-4cf7-86f6-9c1b51fccaec/pull/0.log"
Apr 16 19:56:07.167647 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.167594 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf_0e531a1f-6088-43fd-8a79-76d9f94a2aea/extract/0.log"
Apr 16 19:56:07.193427 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.193372 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf_0e531a1f-6088-43fd-8a79-76d9f94a2aea/util/0.log"
Apr 16 19:56:07.217895 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.217855 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739s4xf_0e531a1f-6088-43fd-8a79-76d9f94a2aea/pull/0.log"
Apr 16 19:56:07.250782 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.250754 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd_1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab/extract/0.log"
Apr 16 19:56:07.272285 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.272213 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd_1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab/util/0.log"
Apr 16 19:56:07.296677 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.296615 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1ggztd_1f7cd65a-fd18-4bfb-b4cf-9ac911ae2aab/pull/0.log"
Apr 16 19:56:07.517013 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.516963 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-2nkqg_42eaca63-e226-40c1-ad9a-98319b8d009f/manager/0.log"
Apr 16 19:56:07.606934 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.606850 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-zzkfr_ed475c47-de86-443a-bd41-0a5b65b616c9/registry-server/0.log"
Apr 16 19:56:07.673752 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:07.673714 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-rzp7h_c5f21efe-2cad-4abc-ae30-4c9d54ddd59d/manager/0.log"
Apr 16 19:56:09.577875 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:09.577842 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s75jz_c95ee486-2760-4ad8-9188-14bfc1cb67df/node-exporter/0.log"
Apr 16 19:56:09.599605 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:09.599571 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s75jz_c95ee486-2760-4ad8-9188-14bfc1cb67df/kube-rbac-proxy/0.log"
Apr 16 19:56:09.619622 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:09.619587 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s75jz_c95ee486-2760-4ad8-9188-14bfc1cb67df/init-textfile/0.log"
Apr 16 19:56:09.972065 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:09.971976 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-h4qrx_bdf9751d-36a3-47a6-93d9-c268be17be2f/prometheus-operator/0.log"
Apr 16 19:56:09.991050 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:09.991020 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-h4qrx_bdf9751d-36a3-47a6-93d9-c268be17be2f/kube-rbac-proxy/0.log"
Apr 16 19:56:10.147900 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:10.147867 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/thanos-query/0.log"
Apr 16 19:56:10.168308 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:10.168274 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/kube-rbac-proxy-web/0.log"
Apr 16 19:56:10.188967 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:10.188939 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/kube-rbac-proxy/0.log"
Apr 16 19:56:10.209013 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:10.208989 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/prom-label-proxy/0.log"
Apr 16 19:56:10.229527 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:10.229445 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/kube-rbac-proxy-rules/0.log"
Apr 16 19:56:10.249374 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:10.249347 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5774b77849-6p65s_c3c44c36-e89c-4df8-bf92-0bc611cfe392/kube-rbac-proxy-metrics/0.log"
Apr 16 19:56:11.452292 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.452254 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"]
Apr 16 19:56:11.461818 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.461785 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.464889 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.464856 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"]
Apr 16 19:56:11.466056 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.466033 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g4x49\"/\"kube-root-ca.crt\""
Apr 16 19:56:11.466902 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.466883 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g4x49\"/\"default-dockercfg-42kbq\""
Apr 16 19:56:11.466979 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.466884 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g4x49\"/\"openshift-service-ca.crt\""
Apr 16 19:56:11.491898 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.491859 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-proc\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.491898 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.491898 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfblm\" (UniqueName: \"kubernetes.io/projected/1cf06f15-fee7-4653-8471-a60744ab55f3-kube-api-access-lfblm\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.492094 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.491925 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-sys\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.492094 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.491964 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-lib-modules\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.492094 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.491994 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-podres\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.592881 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.592846 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-proc\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.593091 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.592896 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfblm\" (UniqueName: \"kubernetes.io/projected/1cf06f15-fee7-4653-8471-a60744ab55f3-kube-api-access-lfblm\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.593091 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.592935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-sys\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.593091 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.592947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-proc\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.593091 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.592961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-lib-modules\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.593091 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.593014 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-sys\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.593091 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.593083 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-lib-modules\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.593366 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.593084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-podres\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.593366 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.593172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1cf06f15-fee7-4653-8471-a60744ab55f3-podres\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.602312 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.602278 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfblm\" (UniqueName: \"kubernetes.io/projected/1cf06f15-fee7-4653-8471-a60744ab55f3-kube-api-access-lfblm\") pod \"perf-node-gather-daemonset-2jm8d\" (UID: \"1cf06f15-fee7-4653-8471-a60744ab55f3\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.773827 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.773788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:11.910466 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.910280 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"]
Apr 16 19:56:11.913001 ip-10-0-130-163 kubenswrapper[2578]: W0416 19:56:11.912963 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod1cf06f15_fee7_4653_8471_a60744ab55f3.slice/crio-ba032d25b45df226a854a9bc6f38e9ce02976d2069d1319243386ebe916ace05 WatchSource:0}: Error finding container ba032d25b45df226a854a9bc6f38e9ce02976d2069d1319243386ebe916ace05: Status 404 returned error can't find the container with id ba032d25b45df226a854a9bc6f38e9ce02976d2069d1319243386ebe916ace05
Apr 16 19:56:11.914846 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:11.914828 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:56:12.389743 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:12.389711 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c7f7f95f9-sn7pt_5b101b29-cfd8-4f18-9019-0845925a7e2d/console/0.log"
Apr 16 19:56:12.888875 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:12.888838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d" event={"ID":"1cf06f15-fee7-4653-8471-a60744ab55f3","Type":"ContainerStarted","Data":"aa58f58f7bf72e5a79e0b9a988cdedb7b4b81efce2cc2f8df382abd1f4a9e4a9"}
Apr 16 19:56:12.888875 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:12.888882 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d" event={"ID":"1cf06f15-fee7-4653-8471-a60744ab55f3","Type":"ContainerStarted","Data":"ba032d25b45df226a854a9bc6f38e9ce02976d2069d1319243386ebe916ace05"}
Apr 16 19:56:12.889300 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:12.888904 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:12.903918 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:12.903856 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d" podStartSLOduration=1.9038353049999999 podStartE2EDuration="1.903835305s" podCreationTimestamp="2026-04-16 19:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:12.902899246 +0000 UTC m=+2284.640802514" watchObservedRunningTime="2026-04-16 19:56:12.903835305 +0000 UTC m=+2284.641738647"
Apr 16 19:56:13.737133 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:13.737104 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7v82m_368abbc9-410f-479c-9fa1-3676e33eeb51/dns/0.log"
Apr 16 19:56:13.756863 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:13.756835 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7v82m_368abbc9-410f-479c-9fa1-3676e33eeb51/kube-rbac-proxy/0.log"
Apr 16 19:56:13.872716 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:13.872683 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kndsd_c8ee73e7-c318-4fd0-a148-eef6ac668052/dns-node-resolver/0.log"
Apr 16 19:56:14.331196 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:14.331158 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-698cb67655-bhcxl_7a37b419-4e8b-4075-8e76-e678c39e2bdb/registry/0.log"
Apr 16 19:56:14.371975 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:14.371945 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v9w9k_d90da04b-9bd0-4142-9e92-b6e47a4f708c/node-ca/0.log"
Apr 16 19:56:15.299900 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:15.299864 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-bnzvp_9b488a9b-2a6e-46f5-80ea-620284daa662/discovery/0.log"
Apr 16 19:56:15.318665 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:15.318634 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-d894ddccb-92r84_819903d8-46ef-467a-8e58-d186915a391c/kube-auth-proxy/0.log"
Apr 16 19:56:15.960303 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:15.960271 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rdj9w_419f622c-e0bd-4976-87ce-7df0a1ed0500/serve-healthcheck-canary/0.log"
Apr 16 19:56:16.558065 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:16.558030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qr8c4_245c93d8-aeaf-4048-9b6a-90741c0996b3/kube-rbac-proxy/0.log"
Apr 16 19:56:16.577350 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:16.577322 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qr8c4_245c93d8-aeaf-4048-9b6a-90741c0996b3/exporter/0.log"
Apr 16 19:56:16.596904 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:16.596873 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qr8c4_245c93d8-aeaf-4048-9b6a-90741c0996b3/extractor/0.log"
Apr 16 19:56:18.486823 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:18.486780 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-s95tf_0718cf45-497e-48d3-8dc6-e073adda1fea/manager/0.log"
Apr 16 19:56:18.521167 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:18.521134 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-866cd599f-xjm69_dc0c645f-bc3d-4d13-9cd4-6ec51e6c2a4d/maas-api/0.log"
Apr 16 19:56:18.571332 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:18.571297 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-66c6fd6db6-k4gb7_fe45c81b-39c5-421f-848e-09468ccfff65/manager/0.log"
Apr 16 19:56:18.592796 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:18.592758 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-xt2qv_ebdd9a87-3859-41df-9bb9-7b8244bbebaa/manager/1.log"
Apr 16 19:56:18.606083 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:18.606046 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-xt2qv_ebdd9a87-3859-41df-9bb9-7b8244bbebaa/manager/2.log"
Apr 16 19:56:18.629261 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:18.629223 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-66b64c949f-c9985_f2d31f98-a2f2-4976-a57e-f7e4f46a93f6/manager/0.log"
Apr 16 19:56:18.903750 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:18.903720 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-2jm8d"
Apr 16 19:56:20.016869 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:20.016832 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-89ht7_a7a5ab98-ce3a-4bf8-837c-3b1ba623c2f6/openshift-lws-operator/0.log"
Apr 16 19:56:24.519124 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:24.519096 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wvlvr_aab238ff-d84e-4ed9-8165-c0f56eacaf68/migrator/0.log"
Apr 16 19:56:24.539651 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:24.539619 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wvlvr_aab238ff-d84e-4ed9-8165-c0f56eacaf68/graceful-termination/0.log"
Apr 16 19:56:26.196750 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.196716 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqvwz_ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:56:26.225313 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.225283 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqvwz_ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e/egress-router-binary-copy/0.log"
Apr 16 19:56:26.247647 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.247616 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqvwz_ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e/cni-plugins/0.log"
Apr 16 19:56:26.267633 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.267603 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqvwz_ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e/bond-cni-plugin/0.log"
Apr 16 19:56:26.287086 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.287057 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqvwz_ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e/routeoverride-cni/0.log"
Apr 16 19:56:26.308050 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.308014 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqvwz_ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e/whereabouts-cni-bincopy/0.log"
Apr 16 19:56:26.327733 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.327705 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqvwz_ccb4c0a2-5a79-41a1-b87b-38f1a9443c9e/whereabouts-cni/0.log"
Apr 16 19:56:26.364218 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.364188 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bl8m2_f85164ec-72d2-43d8-8a96-11e63cc91aeb/kube-multus/0.log"
Apr 16 19:56:26.527541 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.527513 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xhnjz_81fa50c0-8c06-4a6c-9d00-a1ed89b88844/network-metrics-daemon/0.log"
Apr 16 19:56:26.545655 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:26.545619 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xhnjz_81fa50c0-8c06-4a6c-9d00-a1ed89b88844/kube-rbac-proxy/0.log"
Apr 16 19:56:27.635368 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:27.635332 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-controller/0.log"
Apr 16 19:56:27.657013 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:27.656978 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/0.log"
Apr 16 19:56:27.666836 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:27.666809 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovn-acl-logging/1.log"
Apr 16 19:56:27.685698 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:27.685660 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/kube-rbac-proxy-node/0.log"
Apr 16 19:56:27.710023 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:27.709984 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:56:27.733099 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:27.733073 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/northd/0.log"
Apr 16 19:56:27.759008 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:27.758980 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/nbdb/0.log"
Apr 16 19:56:27.782107 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:27.782073 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/sbdb/0.log"
Apr 16 19:56:27.881874 ip-10-0-130-163 kubenswrapper[2578]: I0416 19:56:27.881834 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bcplr_5b22e5a6-5ee7-48e8-b3d5-b1b77686c765/ovnkube-controller/0.log"